0FI_AR_4 DataSource is extracting 0 records during Init

Hi,
The 0FI_AR_4 DataSource is extracting 0 records during Init, but if I run a full load I can get the data.
I am not sure whether our extractors are decoupled.
Currently we are using the GL and AP extractors as well.
We are on ECC 6 and PI_BASIS 2005_1_700.
There were no selection fields defined for 0FI_AR_4 to restrict the Init on fiscal year/period or company code.
I am not sure whether note 551044 is applicable for the current version of the component.
Regards
Ajay
Points will be assigned to all answers.

Hi Anil,
Thanks, this is useful. I have checked table TPS31; the entry is maintained as described in your link.
However, table BWOM_SETTINGS doesn't have any entries in it (do we get entries only after running the delta InfoPackage?).
When I try to install the DataSource from RSA5,
I get the following error messages:
Units field HWAE2 for field H2STE of DataSource 0FI_AR_4 is hidden
Units field HWAE3 for field H3STE of DataSource 0FI_AR_4 is hidden
Units field HWAE2 for field ZLOCAL2 of DataSource 0FI_AR_4 is hidden
Units field HWAE3 for field ZLOCAL3 of DataSource 0FI_AR_4 is hidden
It would be great if you could help me out.
Regards
Ajay

Similar Messages

  • 0FI_AR_4 Initialization - millions of records

    Hi,
    We are planning to initialize 0FI_AR_4 datasource for which there are millions of records available in Source system.
    While checking in Quality system we have realised that just for a single fiscal period it is taking hours to extract data, and in Production system we have data for last 4 years (about 40 million records).
    The trace results (ST05) show that most of the time is spent fetching data from the BKPF_BSID / BKPF_BSAD views.
    I can see an index available on tables BSID/BSAD - Index 5, "Index for BW extraction" - which is not yet created on the database.
    This index has 2 fields: BUKRS & CPUDT.
    I am not sure whether this index will help in extracting data.
    What can be done to improve the performance of this extraction so that the initialization of 0FI_AR_4 can be completed in a reasonable time?
    Appreciate your inputs experts.
    Regards,
    Vikram.

    We are planning to change the existing FI_AR line-item load from a current-fiscal-year full load to delta. As of now, FI_AR_4 is loaded full from R/3 for certain company codes and fiscal year/period 2013001 - 2013012. Now the business wants historical data, and going forward the extractor should bring only the changes (delta).
    We would like to perform the steps below:
    1. Initialisation w/o data transfer on comp_code and FY/period 1998001 - 9999012
    2. Repair full loads for all the historical data, fiscal year/period wise (1998001-1998012, 1999001-1999012, ... current year 2013001 - 2013011), up to PSA
    3. Load these to DSO
    4. activate the requests
    5. Now do a delta load from R/3 to BW till PSA for the new selection 1998001-9999012
    6. load till DSO
    7. Activate the load
    Please let me know if the above steps will bring in all the data for the FI_AR_4 line items, and that no data will be missing once I do the delta load after the repair full loads.
    Thanks

  • 0FI_AR_4 Datasource, Delta

    Hi Experts,
    we are using the 0FI_AR_4 DataSource; it is delta-enabled, but the problem is that we can run the delta only once a day.
    Can anyone please let me know how to change this so that I can run the delta more than once a day?
    Any document or a link would be of great help.
    Thanks in advance.
    Ananth

    hi Ananth,
    take a look Note 991429 - Minute Based extraction enhancement for 0FI_*_4 extractors
    https://websmp203.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=991429&_NLANG=E
    Symptom
    You would like to implement a 'minute based' extraction logic for the data sources 0FI_GL_4, 0FI_AR_4 and 0FI_AP_4.
    Currently the extraction logic allows only for an extraction once per day without overlap.
    Other terms
    general ledger  0FI_GL_4  0FI_AP_4  0FI_AR_4  extraction  performance
    Reason and Prerequisites
    1. There is a huge volume of data to be extracted on a daily basis from FI to BW, and this requires a lot of time.
    2. You would like to extract the data at more frequent intervals during the day, e.g. 3-4 times a day, without re-extracting the data you have already extracted that day.
    In situations where there is a huge volume of data to be extracted, a lot of time is taken up when extracting on a daily basis. Minute-based extraction enables the extraction to be split into convenient intervals and run multiple times during a day. By doing so, the amount of data in each extraction is reduced, and hence the extraction can be done more effectively. This should also reduce the risk of extractor failures caused by huge data volumes in the system.
    Solution
    Implement the relevant source code changes and follow the instructions in order to enable minute based extraction logic for the extraction of GL data. The applicable data sources are:
                            0FI_GL_4
                            0FI_AR_4
                            0FI_AP_4
    All changes below have to be implemented first in a standard test system. The new extractor logic must be tested very carefully before it can be used in a production environment. Test cases must include all relevant processes that would be used/carried in the normal course of extraction.
    Manual changes are to be carried out before the source code changes in the correction instructions of this note.
    1. Manual changes
    a) Add the following parameters to the table BWOM_SETTINGS
                             MANDT  OLTPSOURCE    PARAM_NAME          PARAM_VALUE
                             XXX                  BWFINEXT
                             XXX                  BWFINSAF            3600
                  Note: XXX refers to the specific client(like 300) under use/test.
                  This can be achieved using transaction 'SE16' for table
                             'BWOM_SETTINGS'
                              Menu --> Table Entry --> Create
                              --> Add the above two parameters one after another
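    Once created, the two entries can be double-checked with a small report. This is only a sketch (the report name ZCHECK_BWFIN_SETTINGS is made up, not part of the note), assuming the standard table BWOM_SETTINGS with fields PARAM_NAME and PARAM_VALUE:

    ```abap
    REPORT zcheck_bwfin_settings.

    DATA: lt_set TYPE STANDARD TABLE OF bwom_settings,
          ls_set TYPE bwom_settings.

    * Read the two parameters added above for the new extraction logic
    SELECT * FROM bwom_settings INTO TABLE lt_set
      WHERE param_name = 'BWFINEXT'
         OR param_name = 'BWFINSAF'.

    LOOP AT lt_set INTO ls_set.
      WRITE: / ls_set-param_name, ls_set-param_value.
    ENDLOOP.

    * Both parameters must exist before switching the new logic on
    IF lines( lt_set ) < 2.
      WRITE: / 'BWFINEXT and/or BWFINSAF missing in BWOM_SETTINGS'.
    ENDIF.
    ```

    Running this in the test client after the SE16 maintenance step should list both parameters, with BWFINSAF showing 3600.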
    b) To the views BKPF_BSAD, BKPF_BSAK, BKPF_BSID, BKPF_BSIK
                           under the view fields add the below field,
                           View Field  Table    Field      Data Element  DType  Length
                           CPUTM       BKPF    CPUTM          CPUTM      TIMS   6
                           This can be achieved using transaction 'SE11' for views
                           BKPF_BSAD, BKPF_BSAK , BKPF_BSID , BKPF_BSIK (one after another)
                               --> Change --> View Fields
                               --> Add the above mentioned field with exact details
    c) For the table BWFI_AEDAT index-1  for extractors
                           add the field AETIM (apart from the existing MANDT, BUKRS, and AEDAT)
                           and activate this Non Unique index on all database systems (or at least on the database under use).
                           This can be achieved using transaction 'SE11' for table 'BWFI_AEDAT'
                               --> Display --> Indexes --> Index-1 For extractors
                               --> Change
                               --> Add the field AETIM to the last position (after AEDAT field )
                               --> Activate the index on database
    2. Implement the source code changes as in the note correction instructions.
    3. After implementing the source code changes using the SNOTE instructions, add the following parameters to the respective function modules and activate them.
    a) Function Module: BWFIT_GET_TIMESTAMPS
                        1. Export Parameter
                        a. Parameter Name  : E_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
                        2. Export Parameter
                        a. Parameter Name  : E_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
    b) Function Module: BWFIT_UPDATE_TIMESTAMPS
                        1. Import Parameter (add after I_DATE_HIGH)
                        a. Parameter Name  : I_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
                        2. Import Parameter (add after I_TIME_LOW)
                        a. Parameter Name  : I_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
    4. Working of minute based extraction logic:
                  The minute based extraction works considering the time to select the data (apart from date of the document either changed or new as in the earlier logic).The modification to the code is made such that it will consider the new flags in the BWOM_SETTINGS table ( BWFINEXT and BWFINSAF ) and the code for the earlier extraction logic will remain as it was without these flags being set as per the instructions for new logic to be used(but are modified to include new logic).
    Safety interval will now depend on the flag BWFINSAF (in seconds ; default 3600) and has  a default value of 3600 (1 hour), which would try to ensure that the documents which are delayed in posting due to delay in update modules for any reason. Also there is a specific coding to post an entry to BWFI_AEDAT with the details of the document which have failed to post within the safety limit of 1 hour and hence those would be extracted as a changed documents at least if they were missed to be extracted as new documents. If the documents which fail to ensure to post within safety limit is a huge number then the volume of BWFI_AEDAT would increase correspondingly.
    The flag BWFINSAF could be set to particular value depending on specific requirements (in seconds , but at least 3600 = 1 hour)  like 24 hours / 1 day = 24 * 3600 => 86400.With the new logic switched ON with flag BWFINEXT = X the other flags  BWFIOVERLA , BWFISAFETY , BWFITIMBOR are ignored and BWFILOWLIM , DELTIMEST will work as before.
    As per the instructions above the index-1 for the extraction in table BWFI_AEDAT would include the field AETIM which would enable the new logic to extract faster as AETIM is also considered as per the new logic. This could be removed if the standard logic is restored back.
    With the new extractor logic implemented you can change back to the standard logic any day by switching off the flag BWFINEXT to ' ' from 'X' and extract as it was before. But ensure that there is no extraction running (for any of the extractors 0FI_*_4 extractors/datasources) while switching.
    As with the earlier logic to restore back to the previous timestamp in BWOM2_TIMEST table to get the data from previous extraction LAST_TS could be set to the previous extraction timestamp when there are no current extractions running for that particular extractor or datasouce.
    With the frequency of the extraction increased (say 3 times a day) the volume of the data being extracted with each extraction would decrease and hence extractor would take lesser time.
    You should optimize the interval of time for the extractor runs by testing the best suitable intervals for optimal performance. We would not be able to give a definite suggestion on this, as it would vary from system to system and would depend on the data volume in the system, number of postings done everyday and other variable factors.
    To turn on the New Logic BWFINEXT has to be set to 'X' and reset back to ' ' when reverting back. This change has to be done only when there no extractions are running considering all the points above.
                  With the new minute-based extraction logic switched ON:
    a) Ensure BWFI_AEDAT index-1 is enhanced with the field AETIM and is active on the database.
    b) Ensure BWFINSAF is at least 3600 (1 hour) in BWOM_SETTINGS.
    c) Maintain an optimum value for DELTIMEST as needed (the recommended/default value is 60).
    d) Perform proper testing (functional and performance) in a standard test system, with settings and data identical to the production system, and make sure the results are all positive before moving the changes to production.
    http://help.sap.com/saphelp_bw33/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm

  • Delta update for 0FI_AR_4 DataSource

    Hi Experts,
    We have a couple of reports (InfoProvider: ODS) which are based on the 0FI_AR_4 DataSource. We have enhanced the DataSource and added 7 to 8 fields from the SD and FI areas. For enabling delta, the current mapping is: field UPDMOD (R/3) mapped to 0RECORDMODE (BW).
    One of the enhanced FI fields is Collection Specialist. If this Collection Specialist master record is changed for a particular customer, all newly generated accounting documents show the current Collection Specialist, but the accounting documents already in BW are not updated with the current Collection Specialist.
    Enhancement code for the 2 fields:
    1. SELECT SINGLE KLIMK FROM KNKK INTO V_KLIMK
             WHERE KUNNR = l_s_dtfiar_3-KUNNR AND KKBER = l_s_dtfiar_3-KKBER.
         IF SY-SUBRC = 0.
             l_s_dtfiar_3-ZZKLIMK = V_KLIMK.
         ENDIF.
    2. CLEAR: WA_UDMBPSEGMENTS.
        " Note: SELECT ... ENDSELECT with an empty body keeps the last matching row
        SELECT * FROM UDMBPSEGMENTS INTO WA_UDMBPSEGMENTS
                                   WHERE PARTNER = l_s_dtfiar_3-KUNNR.
        ENDSELECT.
        IF SY-SUBRC = 0.
            SELECT * FROM UDM_SGMT_COMP INTO TABLE IT_COMP_CODE
                                   WHERE COLL_SEGMENT = WA_UDMBPSEGMENTS-COLL_SEGMENT.
            " Take the Collection Specialist if the segment covers this company code
            READ TABLE IT_COMP_CODE WITH KEY COMP_CODE = l_s_dtfiar_3-BUKRS
                                    TRANSPORTING NO FIELDS.
            IF SY-SUBRC = 0.
                l_s_dtfiar_3-ZZCOLL_SPECL = WA_UDMBPSEGMENTS-COLL_SPECIALIST.
            ENDIF.
         ENDIF.
         CLEAR: IT_COMP_CODE.    " the closing period was missing here
    Please suggest what can be done to resolve the issue.
    Thanks,
    Vivek

    Hi,
    I hope you have written the code on the R/3 side. Whenever the Collection Specialist master record is changed for a particular customer, the newly generated accounting documents show the current Collection Specialist, but the accounting documents already in BW will not be converted to the current Collection Specialist, since the data in BW is old data and no logic is written to convert it.
    Try to incorporate the conversion logic in BW as well.
    What I would suggest: whenever there is a change in the Collection Specialist master record, try to trigger a delta record generation that updates the old accounting documents with the new master data in the delta load.
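    A hedged sketch of that idea follows. Only tables BSID and BWFI_AEDAT are standard; the form name, and the decision to flag all of the customer's open AR items, are my own assumptions, so this would need careful testing before use:

    ```abap
    * Sketch: when the Collection Specialist master record changes,
    * record the customer's open AR documents in BWFI_AEDAT so that the
    * next 0FI_AR_4 delta run extracts them again as 'changed'.
    FORM flag_customer_docs_for_delta USING p_kunnr TYPE kunnr.
      DATA: lt_bsid  TYPE STANDARD TABLE OF bsid,
            ls_bsid  TYPE bsid,
            ls_aedat TYPE bwfi_aedat.

    * All open AR line items of the affected customer
      SELECT * FROM bsid INTO TABLE lt_bsid
        WHERE kunnr = p_kunnr.

      LOOP AT lt_bsid INTO ls_bsid.
        CLEAR ls_aedat.
        ls_aedat-mandt = sy-mandt.
        ls_aedat-bukrs = ls_bsid-bukrs.
        ls_aedat-belnr = ls_bsid-belnr.
        ls_aedat-gjahr = ls_bsid-gjahr.
        ls_aedat-aedat = sy-datum.
    *   Documents recorded here are picked up as changed documents
    *   by the next delta extraction
        MODIFY bwfi_aedat FROM ls_aedat.
      ENDLOOP.
    ENDFORM.
    ```

    This only re-sends the line items; the BW transformation/update rules must still overwrite the Collection Specialist field on the existing ODS records.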
    Ramesh

  • Error in extracting records for Loan Management

    Hi
    I am trying to extract records for Loan Management(CFM and CML), but I
    am getting Error .
    During the extraction phase an error occurred for the extractor. An exception, 'error_passed_to_mess_handler' was triggered.
    Following are the extractors I am trying
    0CFM_INIT_POSITIONS
    0CFM_DELTA_POSITIONS
    0CML_CASHFLOW
    0CML_INIT_BUDAT
    0CML_INIT_DFAEL
    0CML_DELTA_CAP
    We faced a similar problem in Development; we then changed the settings in customizing under "Define InfoSources for Position Initialization" inside BW/CML customizing. We changed the date there to 31.12.2006, activated the change pointers, and transported the changes to R/3 Production.
    But in Production it is not working.
    Please let me know if some additional settings are required.

    Hi,
    if you run the extractors in the CML system via RSA3, check out the log. What is the exact message? Is it just the one you already gave, or is there some more information? If you run them in CML, don't forget to specify the target system.
    regards
    Siggi

  • CRM Datasource - no extraction happening

    Folks,
    I am using the CRM standard DataSource 0CMS_RTCM_TRAN (Resale Tracking and Claims Management data).
    I have enhanced this DataSource with four fields and transported it to the Quality system.
    There is no problem with the new additions; the DataSource is extracting properly.
    But the delta update is not working: the delta queue is not picking up any records, although new transactions are getting updated in the respective source tables.
    When I checked in BWA1, I am getting these messages:
    Error message:
    (121) MS_RTCM_TRAN: Table SMOXHEAD contains no BDOC name.
    Warning messages:
    (111) MS_RTCM_TRAN: No mapping found for extract field SLSR_ADDRESS_I. Similarly, for all the DataSource fields it shows the same "no mapping" warning.
    Currently no data is being extracted and updated to the PSA.
    Can you please explain what could be the reason for this and suggest a proper solution?
    Are there any transactions to change the BDoc name for standard DataSources?
    In BWA1, when I tried to change these ADAPTER settings, I got this message:
    "DataSource 0CMS_RTCM_TRAN cannot be edited with this transaction"
    I hope that if I specify the BDoc name, the extraction will happen, so please suggest where I can perform these changes.
    Note: currently no extraction is happening because of this.
    Thanks and Regards,
    Yog

    Hi All
    We are facing the same issue. Can anyone let me know the BAdI for this DataSource 0CMS_RTCM_TRAN? I need the BAdI name where the fields are mapped for this DataSource for the delta update. We are unable to extract data through the delta update, but the full upload is working.
    Thanks in advance
    Dhanu

  • Extract records to Desktop in flat file

    How do I extract records to the desktop in a flat file?
    I am not able to do it from the Syndicator; I tried many options. Can someone help?
    I want to extract the records from MDM to the desktop in Excel/flat-file format, whether from the Data Manager or the Syndicator.
    This is an immediate requirement.

    Hi Shifali,
    I want to extract the records from MDM to Desktop in Excel/Flat file format, whether it is from Data Manager or Syndicator
    - MDM Syndicator is the tool to export data from MDM.
    - Data in the MDM repository can be viewed and managed in MDM Data Manager.
    - Whatever data is present in Data Manager can be exported using MDM Syndicator, using your search criteria.
    How to Extract records to Desktop in flat file.
    I am not able to do it from Syndicator,i tried many options.Some one can help.
    - Master data in an MDM repository can be syndicated using MDM Syndicator in 2 main formats: Flat and XML.
    - For local syndication to your desktop in flat format, follow the steps below:
    1) Create destination items manually in Syndicator under the Destination Items tab.
    2) Map these destination items to the source fields, which are your MDM repository fields.
    3) Use a search criterion if you want to filter records.
    4) Click the Export button in the Syndicator.
    5) It will ask for the output file name and format.
    6) Give a name and select the format from the dropdown in the window (it can be Text, Excel, Access, anything).
    7) Your output file is syndicated to your local desktop.
    Follow the below link for Flat syndication:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60ebecdf-4ebe-2a10-cf9f-830906c73866 (Flat Syndication)
    Follow the below link for XML syndication:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e04fbbb3-6fad-2a10-3699-fbb40e51ad79  (XML Syndication)
    This will give you the clear distinction between the two.
    Hope It Helped
    Thanks & Regards
    Simona Pinto

  • [Solved] 'New' error messages during init after updating initscripts

    After updating to the latest initscripts, I got some error messages during init that I hadn't seen before.  They occurred during one of the UDev steps, and were caused by an attempt to load an IDE module despite the controller being disabled.  The controller shows up in Windows devmgr too, so I guess it's a failing of the mobo or BIOS.
    In any case, I downgraded and found that the errors were nothing new.  I'm just curious if the appearance of the errors during init was by design or a happy accident.
    Last edited by alphaniner (2011-08-01 17:07:30)

    alphaniner wrote:
    karol wrote:If you have the new (i.e. current) initscripts, you should be fine:
    http://projects.archlinux.org/initscrip … 328fbaf2e7
    http://projects.archlinux.org/initscrip … 42dfeca8c2
    I had never noticed the file before yesterday, actually. And I'm up-to-date, yet it's still full of 'em.
    You need to reboot.
    After you do, run
    tail /var/log/boot
    Mine isn't perfect, but I've already sent an e-mail to the dev (it's a matter of just a sed oneliner)
    [karol@black ~]$ tail /var/log/boot
    Tue Aug 2 05:03:01 2011: :: Setting Hostname: black ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:01 2011: :: Setting Locale: pl_PL.UTF-8 ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:01 2011: :: Setting Consoles to UTF-8 mode ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:01 2011: :: Loading Keyboard Map: pl ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:02 2011: :: Loading Console Font: Lat2-Terminus16 ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:02 2011: :: Saving dmesg Log ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:02 2011: INIT: Entering runlevel: 3
    Tue Aug 2 05:03:02 2011: :: Starting Syslog-NG ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:03 2011: :: Starting D-BUS system messagebus ^[[171G [BUSY] ^[[171G [DONE]
    Tue Aug 2 05:03:04 2011: :: Starting acpid ^[[171G [BKGD] :: Starting gpm ^[[171G [BKGD^[[0;1

  • DataSources and extract structures generation

    Hi,
    How would I generate the DataSources and extract structures using transaction LBWE? It seems that there are no push buttons in LBWE to generate the DataSources and extract structures. Would clicking on the DataSources and extract structures and then clicking on the Continue button be enough?
    Thanks

    I checked the LBWE maintenance structure column, and the fields are there on the left-hand side for the selection criteria.
    How would I check "..<i>fields you have enhanced are ready to use</i>...." from LBWE?
    From RSA6, I would double-click on the LIS DataSource, and I see no enhanced fields and therefore could not see whether they are hidden.
    "<i>Did you check in LBWE if your structures have the fields ready to use?</i>" How would I do this? Maybe this is the problem.
    Thanks

  • Datasources and Extract structure

    Hi guys,
    I am pretty confused about DataSources and extract structures.
    Can someone please explain them to me in simple words?

    Hi Zubin,
    A DataSource is a consolidated list of the fields available, and the extract structure is a data dictionary structure which describes the fields with additional technical elements like data element, domain and so on...
    To make it more clear look at this description from F4 help:
    A DataSource is an object for retrieving data. The DataSource is localized in the OLTP system.
    It has
    an extract structure,
    an extraction type, and
    an extraction method.
    The extract structure describes the fields of the internal table that contain the extracted data. The extraction type describes which type of extractor the DataSource uses. The extraction method extracts the data and transfers it into an internal table with the same type as the extract structure.
    The DataSource also contains information on the type of data that it stages, for example attributes, transactional data, hierarchies, or texts. It can also support different types of data update.
    Extract Structure for a DataSource
    The extract structure for a DataSource shows the format in which the DataSource, or the extractor for the DataSource, transfers its data.
    A data element must be assigned to each field of the extract structure. This allows an intelligent mapping between field names and InfoObjects in the Business Information Warehouse using just this data element.
    The extract structure must be created in the DDIC as a dictionary structure or transparent table. A view is not permitted here, since a view would not give you the option to add an append.
    Appends enable you to implement your individual requirements and your own business logic in the extraction process. You can fill the fields of the append using function enhancements.
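    As a rough illustration of that last point, append fields for transactional data are commonly filled in the customer exit EXIT_SAPLRSAP_001 (include ZXRSAU01). The append field ZZMYFIELD and the derivation logic below are placeholders, not anything from this thread:

    ```abap
    * Include ZXRSAU01 (customer exit EXIT_SAPLRSAP_001) - sketch only.
    * ZZMYFIELD is a hypothetical append field on the extract structure.
    DATA: l_s_dtfiar_3 TYPE dtfiar_3.

    CASE i_datasource.
      WHEN '0FI_AR_4'.
        LOOP AT c_t_data INTO l_s_dtfiar_3.
    *     Fill the append field with your own business logic here
          l_s_dtfiar_3-zzmyfield = 'X'.
          MODIFY c_t_data FROM l_s_dtfiar_3.
        ENDLOOP.
    ENDCASE.
    ```

    The exit is called once per data package, so any lookups inside the loop should be buffered to keep extraction performance acceptable.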
    Hope this helps
    Thanks,
    Raj

  • Error in OLTP 2.0 extractor during init. selection

    During 0FIAR_O03 Init package load on PRD system I got thousands of errors with the same message:
    "Error in OLTP 2.0 extractor during init. selection"
    What's wrong with these selections?
    My data load is 3.x type
    I'm loading into BI for SAP Netweaver 2004s

    I get an error while trying an INIT LOAD via RSA3:
    "Could not determine BW release of logical system ''"
    I get no errors if I try a FULL LOAD.

  • DataSource 0FC_INVDOC_00 - Extraction

    Hi Experts,
    I am using the 0FC_INVDOC_00 DataSource for extracting invoicing data to BW. Data is not coming into BW.
    I referred to this link: http://help.sap.com/erp2005_ehp_03/helpdata/EN/43/4d3eb20420321ae10000000a422035/frameset.htm.
    I am using this standard DataSource and did the initialisation as mentioned in the link.
    Another SDN thread, "0FC_INVDOC_00 FICA Extraction of Invoicing Document Data", mentions that I need to run mass activity type 2620 and then activate event 2710.
    I checked with my project FI consultants; they told me mass activity 2620 is tax-rate related, which we are not using.
    As mentioned in the SAP help link,
    Step 2: Processing of extraction orders (delta extraction)
    You start the extraction order from the SAP menu by choosing Invoicing -> Extraction for Business Intelligence -> BI Extraction Invoicing Documents.
    I am not getting these options in my R/3 (IS-U) system.
    Please tell me the steps to extract data for this DataSource and which mass activity needs to be run in the IS-U system.
    Help me.
    Regards,
    Asish

    Hi,
    Can you please share the how-to / steps by which this issue was resolved?
    regards,
    Rajesh

  • Extracting Record Model information from RMPS(RMS)

    Hello gurus,
    Has anyone dealt with extracting record model data from RMPS and showing it in the Portal?
    Is there any standard functionality?

    Hello Kairat,
    Please go through the below link, it provides you the essential information.
    https://help.sap.com/saphelp_nw70/helpdata/EN/90/18413a38cec628e10000000a11402f/frameset.htm
    And you can post Records and Case Management related queries in the document management forum.
    Regards
    Keerthika

  • Posssible to redirect the content of SCREEN during init/userspace?

    I went to IRC the other day and asked where the userspace errors are usually located, a lot of users pointed me in the /var/log direction. But I can't for the life of me find anything that's related to userspace, only kernel errors.
    Example (sorted by now): netcfg complains during init that there's a profile error, bla bla. Naturally I try to grep "Profile x error" in every log in /var/log, but the result gives me nothing.
    Where do we locate standard userspace errors? If there's no way other than using Scroll Lock, can't we take the output and redirect it to a log instead?
    Thank You

    bender02 wrote:
    greenfish wrote:Where do we locate standard userspace errors? if there's no way other than using scroll lock, can't we take the output and direct it to a log instead?
    I think it's the matter of each "userspace" program to allow and arrange for logging of its output, via the config file and/or by command line switches, or automatically. If you need to capture output of a particular program/daemon run during the startup, you can always restart it (sudo /etc/rc.d/<daemon_name> restart) to get its output again, or better, take a look at how is the program actually run by looking at the script /etc/rc.d/<daemon_name> and try to run it from command line manually (especially if it doesn't allow reasonably logging its output).
    Sorry for the delay, guys. I guess you have to manually subscribe to each thread in order to receive an update.
    bender02: hmm, I see. I guess that makes sense, but it's still going to be a whole mess going through the entire daemon path just to see which one is causing the errors, and those are just daemons - what about standard errors in userspace? Your idea is sound, though; now I only need to find out exactly what process/daemon is causing my mess and write the output > file. Thanks dude!
    @sjovan: problem resolved, but most of the errors I get in userspace aren't visible with "dmesg". Thanks anyway!

  • [SOLVED] alsa daemon error during init with Thinkpad X220

    Hi,
    I just updated my system (including a kernel update). After reboot, I found alsa daemon, which is used to restore previous volume level, failed to start. I tried to start that daemon manually after the boot process is completed, and that produced an error message
    sudo rc.d start alsa
    :: Restoring ALSA Levels [BUSY]
    Found hardware: "HDA-Intel" "Intel CougarPoint HDMI" "HDA:14f1506e,17aa21da,00100000 HDA:80862805,80860101,00100000" "0x17aa" "0x21da"
    Hardware is initialized using a generic method
    [FAIL]
    However, the volume level was restored successfully. It seems that recent Intel platforms provide two sound cards, and the error only affected the secondary card, which has nothing to do with the system volume level.
    Anyway, I hate to have a ``[FAIL]'' during the init process. Any suggestions to get rid of that problem? Thanks.
    System Info: (I use linux-ck as my kernel with modified config)
    uname -a
    Linux Thomas 3.2.1-2-ck #1 SMP PREEMPT Sun Jan 15 00:16:41 CST 2012 x86_64 Intel(R) Core(TM) i7-2620M CPU @ 2.70GHz GenuineIntel GNU/Linux
    lspci
    00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
    00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
    00:19.0 Ethernet controller: Intel Corporation 82579LM Gigabit Network Connection (rev 04)
    00:1a.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 04)
    00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 04)
    00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b4)
    00:1c.1 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 2 (rev b4)
    00:1c.3 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 4 (rev b4)
    00:1c.4 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 5 (rev b4)
    00:1c.6 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 7 (rev b4)
    00:1d.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 04)
    00:1f.0 ISA bridge: Intel Corporation QM67 Express Chipset Family LPC Controller (rev 04)
    00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 04)
    00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 04)
    03:00.0 Network controller: Intel Corporation Centrino Advanced-N + WiMAX 6250 (rev 5e)
    0d:00.0 System peripheral: Ricoh Co Ltd Device e823 (rev 07)
    0e:00.0 USB controller: NEC Corporation uPD720200 USB 3.0 Host Controller (rev 04)
    config.gz:
    http://codepad.org/XtUSbtdw
    (Edit typo)
    Last edited by cap_sensitive (2012-01-17 06:12:01)

    This morning, I performed a system update and I did not see any errors with alsa when I rebooted.  Audio hardware is:
    00:1b.0 Audio device: Intel Corporation 5 Series/3400 Series Chipset High Definition Audio (rev 05)
    Have you tried manually restarting alsa after the system boots up to see if you get the same error?  You can restart alsa with the following command:
    sudo /etc/rc.d/alsa force-restart
    Post your results.  If the error occurs during the manual restart, it should also include some details about what caused the error
