How to extract audit log data from every document library in a site collection using PowerShell?

Hi All,
I have a number of document libraries in one site collection.
My question is: how do I extract audit log data from every document library in a site collection using PowerShell?
Please suggest a solution.

Hi inguru,
SharePoint audit log data is stored at the site collection level, so there is no built-in way to extract it for a single document library.
As a workaround, you can export the site collection's audit log data to a CSV file using PowerShell, then filter the per-library entries in Excel.
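A minimal sketch of such an export (assuming auditing is enabled on the site collection and the SharePoint snap-in is loaded; the site URL and output path are placeholders):

$site = Get-SPSite "http://yourserver/sites/yoursite"

# Query all audit entries for the site collection.
$query = New-Object Microsoft.SharePoint.SPAuditQuery($site)
$entries = $site.Audit.GetEntries($query)

$entries | ForEach-Object {
    New-Object PSObject -Property @{
        Occurred    = $_.Occurred
        Event       = $_.Event
        UserId      = $_.UserId
        DocLocation = $_.DocLocation   # filter on this column per library in Excel
    }
} | Export-Csv "C:\temp\audit.csv" -NoTypeInformation

$site.Dispose()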
More information:
SharePoint 2007 \ 2010 – PowerShell script to get SharePoint audit information:
http://sharepointhivehints.wordpress.com/2014/04/30/sharepoint-2007-2010-powershell-script-to-get-sharepoint-audit-information/
Best Regards
Zhengyu Guo
TechNet Community Support

Similar Messages

  • How to extract audit log from R/3 into BW

    Hi, I have a request: how do I extract the audit log from R/3 into BW?
    Is there a DataSource or InfoCube I can use?

    HI ,
    Identify your audit log table and create a generic DataSource for it using transaction RSO2.
    Regards
    BVR

  • How to transfer service ticket data from ECC6.0 to CRM 7.0 using LSMW?

    How do I transfer service ticket data from ECC 6.0 to CRM 7.0 using LSMW?
    Please suggest a BAPI or IDoc for that.
    Thanks in advance.

    You have to convert your long string to a table of shorter strings.
    There may be other ways, but one possibility is to use a loop to process your string:
    while (there is something left)
       put the next e.g. 1024 characters in a new row of your table
    endwhile
    If you need to reconstruct your string from the table, don't use simple concatenation, since it will remove blanks at the end of lines. Believe me (from experience): sooner or later this will happen.
    Instead, you need to either set subsections of your long string directly, or insert from the end of your table and keep shifting the contents right (probably less efficient).

  • How to extract the historical data from R/3

    hi
    I am extracting data from R/3 through LO extraction. The client asked me to enhance the DataSource by adding a field. I have added the field and written an exit to populate its data.
    How do I extract the historical data into BI for the enhanced field? A delta load is already running in BI.
    regards

    Hi Satish,
    As per the SAP standard, the best way is to delete all data from the cube and then reload it from the setup tables, since you have enhanced the DataSource.
    Loading normally after a DataSource enhancement is not enough, because you won't get any historical data for the new field.
    It is best to take downtime from the users; normally we do this on weekends or during non-business hours.
    Then fill the setup tables. If the data volume is huge, you can adopt a parallel mechanism, for example:
    1. Load the setup tables year by year as a background job.
    2. Load the setup tables year by year, with posting periods from 1 Jan to 31 Dec of each year, as a background job.
    This makes the setup table load easier and faster. After filling the setup tables, you can unlock all users, as there are no worries about postings.
    Then you can load all the data into BI, first into the PSA and then into the cube.
    Regards,
    Ravi Kanth.

  • How to extract 64-bit data from an IMAQ image using the IMAQ Extract VI

    I have LV 8.5.1 and Vision 8.5 and need to extract 64-bit data from a 64-bit image, but I get the "invalid image" error while using the IMAQ Extract VI. What version of Vision do I need to allow me to do this?
    Currently, the workaround I have is:
    1) convert the image to 32-bit
    2) use the ROI tools to get the rectangle data I need
    3) go back to the original image and convert it to a 64-bit array
    4) use the rectangle data to extract what I need out of the 64-bit array data.
    Clunky, but it works. I would think that the IMAQ Extract tool should allow me to extract the 64-bit data, but it doesn't; it forces me to 32-bit.
    Suggestions?

    steve05ram360 wrote:
    awesome, that does work. 
    The attached DLL is slightly corrected and should now also work "in place" when Dst is not connected, like the original IMAQ function. Hopefully it works properly now. By the way, all IMAQ types are supported, not only U64.
    Andrey.
    Attachments:
    ADVExtractDLL.zip ‏9 KB

  • How to get the raw data from a particular document's schedule?

    Hello,
    I can now get data from a document using the RESTful Web Services SDK. What I need is to get the data not from the current version of the document, but from a schedule that was executed some time ago, with older data than the current data.
    Any hints?

    Hey Jacek,
    Please look at /schedules in the Raylight API.
    Regards,
    Anthony

  • How to extract HRM master data from R/3 into LDIF file?

    Recently I have been asked to provide an extract from our R/3 system
    with some Human Resource master data. The extract has to be in the LDIF
    format (LDAP data interchange format). It is needed to import into a
    DirX metahub solution from Siemens.
    How can this be done most easily?
    (Does SAP provide tools? Can XI do this?) Or do we have to write a custom ABAP program for it?
    Thanks in advance
    Kind regards
    Alex Veen


  • How to extract FORM 16 data from R/3

    Hi Experts,
    My client wants to see the FORM 16 report in BI. Is there any BI Content extractor to extract FORM 16 data to BI?
    If anyone has worked on this, please explain how to get FORM 16 data from R/3 to BW.
    Thanks,
    -Vijay

    Hi
    I do not know exactly what you should do, but I suggest the following:
    Go to transaction SE71 and press F4.
    Expand Payroll -> <Country> -> Income Tax.
    Here you can find the form name.
    Look at the tables and fields used there and check whether any standard extractor covers them; if none is available, create a generic DataSource.
    Assign points if useful
    Regards
    N Ganesh

  • How to extract the budget data from Essbase to an MS Access database

    Hi,
    I want to know how to extract budget data from Hyperion to an MS Access database.
    Please help me with how to proceed and what process I need to follow.
    Regards
    Hypuser

    You can write a calc script to export only the 'Budget' data and load the exported file back into the MS Access database via a loader or import.
    Cheers
    Cnee:)

  • How to delete Change log data from a DSO?

    Hello Experts,
    I am trying to delete the change log data for a DSO which has some 80 crore (800 million) records in it.
    I am following the standard procedure, using the process chain variant and specifying the number of days, but somehow the data is not getting deleted.
    However, the process chain completes successfully with status G.
    Please let me know if there are other ways to delete the data.
    Thanks in Advance.
    Thanks & Regards,
    Anil.

    Hi,
    Then there might be something wrong with your change log deletion variant.
    Can you recreate the change log deletion variant and set it up again?
    Try to check the settings below with the new variant:
    Red mark - requests won't be selected.
    Provide the DSO name and InfoArea, set 'older than', and select the blue mark.
    Blue mark - only successfully loaded requests older than N days will be deleted.
    Have you tested this change log deletion process type before moving it to production, per your data flow?
    Thanks

  • Fastest way to get data from multiple lists across multiple site collections

    HI
    I need to get data from multiple lists spread across 20 site collections and show it as a list view.
    I have searched the internet and found some options, such as the search core APIs or BCS. I can't use search because I want real-time data, and I am not sure of any other ways.
    If anybody can provide ideas, it would help.

    Might LINQ be an option for you? Using LINQPad and the SharePoint Connector, you should be able to write a query that will retrieve this data, from which you can tabulate it. I'm not sure how you'd be able to automate this any further so that it's then imported as a list.
    For something more specific, I used a third-party tool called the Lightning Tools Lightning Conductor, which is in essence a powerful content roll-up tool. In one of my solutions, I created a calculated column that gave an order/ranking to each item, so that when lists were combined, they'd still have some form of order. The web part is also fairly customisable and has always proven a useful tool. A plain PowerShell roll-up is sketched after this reply as well.
    Hope that helps.
    Steven Andrews
    SharePoint Business Analyst: LiveNation Entertainment
    Blog: baron72.wordpress.com
    Twitter: Follow @backpackerd00d
    My Wiki Articles:
    CodePlex Corner Series
    Please remember to mark your question as "answered" if this solves (or helps) your problem.
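    As a script-based alternative to the tools above, here is a minimal PowerShell sketch (the web application URL and the list name "Tasks" are placeholders) that walks every site collection and rolls up items from a commonly named list:
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $results = @()
    foreach ($site in Get-SPSite -WebApplication "http://yourserver" -Limit All) {
        foreach ($web in $site.AllWebs) {
            # TryGetList returns $null when the web has no list by that name.
            $list = $web.Lists.TryGetList("Tasks")
            if ($list -ne $null) {
                foreach ($item in $list.Items) {
                    $results += New-Object PSObject -Property @{
                        Site  = $web.Url
                        Title = $item.Title
                    }
                }
            }
            $web.Dispose()
        }
        $site.Dispose()
    }
    $results | Format-Table Site, Title -AutoSize
    Note that iterating every web is slow at scale, which is exactly why content roll-up tools or search are usually preferred.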

  • How to fetch folders and subfolders from a SharePoint document library

    I have a document library. Under "Documents" there are some folders, and under some folders there are subfolders.
    I need to fetch the folders into a dropdown list. If I select a folder in the dropdown list, I need to fetch that folder's subfolders into another dropdown list.
    How do I achieve this?

    The function below returns the ID of the parent folder in which a given item is stored. Pass folder = item.Folder; initially it will be null.
    static string GetParentFolder(SPListItem itemToFind, SPFolder folder)
    {
        // Search the item's parent list recursively for the item's ID.
        SPQuery query = new SPQuery();
        query.Query = "<Where><Eq><FieldRef Name=\"ID\"/><Value Type=\"Integer\">" + itemToFind.ID + "</Value></Eq></Where>";
        query.Folder = folder;
        query.ViewAttributes = "Scope=\"Recursive\"";

        SPListItemCollection items = itemToFind.ParentList.GetItems(query);
        int parentFolderId = 0;
        foreach (SPListItem item in items)
        {
            // Resolve the file and read its parent folder's item ID.
            SPFile f = item.Web.GetFile(item.Url);
            parentFolderId = f.ParentFolder.Item.ID;
        }
        return parentFolderId.ToString();
    }
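    To inspect the same folder hierarchy from PowerShell (a quick sketch; the site URL and the library name "Documents" are placeholders), the object-model calls are identical to what you would bind to the dropdowns in code:
    $web = Get-SPWeb "http://yourserver/sites/yoursite"
    $library = $web.Lists["Documents"]
    foreach ($folder in $library.RootFolder.SubFolders) {
        if ($folder.Name -eq "Forms") { continue }   # skip the hidden Forms folder
        Write-Output "Folder: $($folder.Name)"
        foreach ($sub in $folder.SubFolders) {
            Write-Output "  Subfolder: $($sub.Name)"
        }
    }
    $web.Dispose()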

  • How to get the user Created At and Modified At properties for a site collection using PowerShell

    Hi guys, I know how to get the list of users of a site collection with the Get-SPUser cmdlet, but the problem is that this cmdlet doesn't give me the user's Created At and Modified At properties.
    Can anyone tell me how to get these values via PowerShell?
    PS: ignore the 2013 screenshot; I just want a way to get those values. If you provide a solution for either 2010 or 2013, I will work out the other.

    Get the User Information List and then get the user from that list:
    $web = Get-SPWeb "siteUrl"
    $userInfoList = $web.SiteUserInfoList
    $userItem = $userInfoList.Items[0]; #0 here is just for demonstration. You take the user you want here or loop through all users.
    $created = $userItem["Created"]
    $modified = $userItem["Modified"]
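    If you need the values for every user, here is a minimal sketch (the output path is a placeholder) that loops through the whole User Information List and exports them:
    $web = Get-SPWeb "siteUrl"
    $web.SiteUserInfoList.Items | ForEach-Object {
        # Emit one row per user with the two audit fields.
        New-Object PSObject -Property @{
            User     = $_["Title"]
            Created  = $_["Created"]
            Modified = $_["Modified"]
        }
    } | Export-Csv "C:\temp\users.csv" -NoTypeInformation
    $web.Dispose()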

  • [solved] How to extract recent log entries from a file (based on time)?

    I have a daily log file with hundreds of thousands of entries in the following format. 
    field1,field2,field3,field4,field5,field6,field7,field8,field9,20110516192001.100
    field1,field2,field3,field4,field5,field6,field7,field8,field9,20110516192002.200
    field1,field2,field3,field4,field5,field6,field7,field8,field9,20110516192003.300
    field1,field2,field3,field4,field5,field6,field7,field8,field9,20110516192004.400
    field1,field2,field3,field4,field5,field6,field7,field8,field9,20110516192005.500
    It's always in the same format and the 10th field is always the timestamp (YYYYMMDDHHMMSS.MS)
    Since the file rotates daily, the 10th field will always be 20110516xxxxxx.xxx for today and will be 20110517xxxxxx.xxx tomorrow
    What I want to do is only look at entries that have been written in the last 30 minutes.
    At a high level, here's my plan
    1) Get the date/time from 30 minutes ago... write it to a variable
    2) Iterate through the file line by line comparing the 10th field to the variable, if it's larger write the line to a tmp file
    3) Use tmp file for my analysis
    This seems incredibly inefficient to me...  what would be a more graceful way to do it?  I have regular solaris tools at my disposal (plus python)
    Thanks
    Last edited by oliver (2011-05-17 12:41:43)

    The algorithm you describe really is a viable approach. Since this is a log file, each line should have a timestamp later than all lines that precede it in the file. A more efficient algorithm could do a binary search through the file for the timestamp you are interested in. That would be easy enough to do in C or Python, but your algorithm could be fast enough. If that is the case, you could try the following quick & dirty bash script.
    #!/bin/bash
    # Convert a YYYYMMDDHHMMSS value to seconds since the epoch.
    seconds() {
        secs=$(($1 % 100))
        mins=$(($1 / 100 % 100))
        hrs=$(($1 / 10000 % 100))
        days=$(($1 / 1000000 % 100))
        month=$(($1 / 100000000 % 100))
        year=$(($1 / 10000000000))
        LC_TIME=C date +%s -d "$(printf "%d-%02d-%02d %02d:%02d:%02d" $year $month $days $hrs $mins $secs)"
    }
    found=0
    now=$(date +%s)
    while read line
    do
        if [ "$found" -eq "0" ]
        then
            ts=${line##*,}              # 10th field: the timestamp
            ts=$(seconds ${ts%.*})      # drop milliseconds and convert
            diff=$(( (now - ts) / 60 )) # age of the entry in minutes
            [[ $diff -lt 30 ]] && found=1
        fi
        [[ $found -ne 0 ]] && echo "$line"
    done < "$1"
    It will write (to stdout) all lines following the first line that has been time stamped within the last 30 minutes (ignoring milliseconds). You could redirect the output of this script to a file of your choice for analysis as follows:
    $ ./script logfile > tmp
    Last edited by rockin turtle (2011-05-17 06:58:41)

  • How to extract sales flow data from the VBFA table to BI through a function module

    Hi Experts,
    I am working on SD. I need to know how VBFA data is extracted into BI using a function module, and how to report the relationship between sales order, PGI and invoice. Can anyone help me?
    Thanks in advance.

    Hi,
    If you are looking for a standard extractor, check the Metadata Repository; if you want a generic extractor, you can take the function module RSAX_BIW_GET_DATA_SIMPLE as a template for writing your own extractor.
    Regards,
    Durgesh.
