Split large .pptx file to multiple files using PowerShell

Hi.
I'm not a programmer, and I have been given the task of splitting a big .pptx file (>100 slides) into multiple .pptx files with 4 slides each and saving them into a SharePoint library. It should be done with PowerShell.
Thanks for any help!

Hi,
For splitting PowerPoint files into multiple parts, I would suggest you post this question to the Office forum, where you will get more help and confirmed answers:
http://social.technet.microsoft.com/Forums/office/en-US/home
For uploading files to a SharePoint library using PowerShell, here are some links with script demos for your reference:
http://social.technet.microsoft.com/wiki/contents/articles/19529.sharepoint-2010-upload-file-in-document-library-using-powershell.aspx
http://spfileupload.codeplex.com/
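In the meantime, here is a rough sketch of how the split and upload could be scripted with PowerPoint COM automation and the SharePoint server object model. Treat it as a starting point rather than a tested solution: the file paths, site URL and library name are placeholders, and it assumes PowerPoint is installed on the machine running the script.
# Sketch only: split Big.pptx into 4-slide files, then upload the pieces to a library.
# All paths, the site URL and the library name below are placeholders.
$source    = "C:\Temp\Big.pptx"
$outputDir = "C:\Temp\Split"
$chunkSize = 4

$pp    = New-Object -ComObject PowerPoint.Application
$src   = $pp.Presentations.Open($source)
$total = $src.Slides.Count

for ($start = 1; $start -le $total; $start += $chunkSize) {
    # Save a full copy, then delete every slide outside the current 4-slide window
    $part     = [math]::Ceiling($start / $chunkSize)
    $copyPath = Join-Path $outputDir ("Part{0}.pptx" -f $part)
    $src.SaveCopyAs($copyPath)
    $copy = $pp.Presentations.Open($copyPath)
    for ($i = $copy.Slides.Count; $i -ge 1; $i--) {
        if ($i -lt $start -or $i -ge $start + $chunkSize) { $copy.Slides.Item($i).Delete() }
    }
    $copy.Save()
    $copy.Close()
}
$src.Close()
$pp.Quit()

# Upload the pieces (run in the SharePoint Management Shell on a SharePoint server)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$web    = Get-SPWeb "http://yourserver/sites/yoursite"
$folder = $web.GetFolder("Documents")
Get-ChildItem $outputDir -Filter *.pptx | ForEach-Object {
    $stream = $_.OpenRead()
    $folder.Files.Add("Documents/" + $_.Name, $stream, $true) | Out-Null
    $stream.Close()
}
$web.Dispose()
You would still need to create the output folder first, add error handling, and release the COM objects when finished; the articles linked above cover the upload part in more detail.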
Best regards
Patrick Liang
TechNet Community Support

Similar Messages

  • Syndicator Server: is it possible to split the export file into multiple files?

    Hello,
    Is it possible to split the export files into multiple files, e.g., every 5,000 records?
    Perhaps there is an option in mdss.ini? Or a setting in the mapping?
    Thank you for your responses!
    Melanie

    Hi Melanie,
    - If you are syndicating in XML format, you have the option to syndicate one XML file for every record, or multiple records in one XML file.
    For this you have to make a simple setting in Syndicator -> Map properties -> XML file output -> (multiple files/single file).
    The other way around is:
    - If you are syndicating in any other format, say text, the output goes as one file for all records present in the MDM repository; for this you can use the search options.
    - Create a search on some field value which will select a set of records from the lot.
    - Then you can syndicate only those records which satisfy the search criteria.
    - In this way it is possible to syndicate in parts.
    Hope It Helped,
    Kindly Reward Points if found useful
    Thanks & Regards
    Simona Pinto

  • Split download file into multiple files

    Hi all,
    I'd like to split my internal table into multiple files when downloading. The split will be determined by the contents of a specific field (e.g., create a new file for every unique country, where country is a key field). The number of files to download can only be determined at runtime.
    Any ideas on how I can resolve this?
    Thanks,
    Chris.

    Try this:
    " itab1 has the same structure as itab; itab must be sorted by bukrs for AT END OF to work
    DATA: itab1 LIKE itab OCCURS 0 WITH HEADER LINE,
          filename TYPE string.
    LOOP AT itab.
      MOVE-CORRESPONDING itab TO itab1.
      APPEND itab1.
      AT END OF bukrs.
        " one file per company code
        CONCATENATE 'Company_' itab-bukrs INTO filename.
        CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
            filename = filename
          TABLES
            data_tab = itab1.
        REFRESH itab1.   " start a new file for the next company code
      ENDAT.
    ENDLOOP.

  • Split TempDB Data file into multiple files

    Hey,
    I have been seeing TempDB contention in memory on our SQL Server 2012 Enterprise Edition with SP2, and I need to split the TempDB data file into multiple files.
    Could someone please help me verify the following information:
    1] We are on SQL Server 2012 Enterprise Edition with Service Pack 2, but under SQL Server 2012 Enterprise Edition CAL licensing we are limited to 20 logical processors instead of 40. Our SQL Server is configured across NUMA nodes, and with this limitation SQL uses only 2 NUMA nodes in production, with 10 logical CPUs evenly assigned to each NUMA node. Microsoft recommends that if SQL Server is configured on NUMA and we have 2 NUMA nodes, we may add two data files at a time. Should I add two TempDB data files at a time?
    2] We have the TempDB data and log files on the same drive of the SQL Server. When I split TempDB into two data files, I can keep them on the same drive. What is your recommendation: should I create the TempDB data files on the same drive or on separate disks?
    3] What would be the back-out plan for splitting TempDB into multiple files? Please let me know if someone has a better back-out plan than this:
    1] Run a script that recreates the TempDB database with a single file.
    2] Restart the SQL Server service to apply the change.
    Your help will be appreciated.
    Thanks,
    Daizy

    Tom, I am seeing TempDB contention on the production server when there is a heavy load on SQL Server. We are also experiencing overall system slowness. Please look at the PAGELATCH wait statistics on our server and advise.
    wait_type       waiting_tasks_count   wait_time_ms   max_wait_time_ms   signal_wait_time_ms
    PAGELATCH_UP                2680948        3609142              10500                508214
    PAGELATCH_SH                1142213        1338451               8609                324538
    PAGELATCH_NL                      0              0                  0                     0
    PAGELATCH_KP                      0              0                  0                     0
    PAGELATCH_EX               44852435        7798192               9886               6108374
    PAGELATCH_DT                      0              0                  0                     0
    Thanks ,
    Daizy
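    For reference, adding the extra data files themselves is just an ALTER DATABASE statement, which can also be scripted from PowerShell. The sketch below is only an illustration: the instance name, file names, paths and sizes are made-up placeholders, and it assumes the module that provides Invoke-Sqlcmd is installed.
    # Illustration only: add two TempDB data files (instance name, paths and sizes are placeholders)
    Invoke-Sqlcmd -ServerInstance "MYSERVER" -Database master -Query "ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 4GB, FILEGROWTH = 512MB);"
    Invoke-Sqlcmd -ServerInstance "MYSERVER" -Database master -Query "ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdev3.ndf', SIZE = 4GB, FILEGROWTH = 512MB);"
    Adding files takes effect immediately; removing a TempDB file later generally requires emptying it first and may need a service restart, which is why a back-out script like the one described above is worth keeping.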

  • Save file in multiple directories using receiver file adapter?

    Hi,
    Is it possible to save a file to multiple directories using the receiver file adapter?
    Regards,
    Ashish

    Well, there is a roundabout way to do that -
    The idea is to use multi-mapping (1:n mapping).
    1) Map your message to 2 different record set nodes. Since you want to use the same file, both mappings will look exactly the same. Make sure that the file path and file name are part of the output payload message.
    2) In the file adapter configuration, make sure the file name and file path are taken from these payload fields. You can use a context object to refer to these fields. Voila... the files are created in the 2 directories you mentioned.
    Of course, the simplest way is to route the same message to 2 business systems/services and write them out using 2 communication channels.
    Arvind R

  • Split text file in multiple files based on a string

    Hey all,
    I want to split a text file into multiple files. I already found some examples where there is a split based on a number of files.
    http://forum.java.sun.com/thread.jspa?forumID=256&threadID=260930
    But I want to split based on a string (word) that I find in the file.
    Can anyone help me?
    Regards,
    Atmoz

    This is my test code as it is now. Maybe there is a bug in there that causes a memory leak or something.
    import java.io.*;

    public class test {
        public static void main(String args[]) {
            File sSourceDir = new File("D:\\Test\\");
            File sDestinationDir = new File("D:\\Test\\");
            // Filter is the poster's FileFilter implementation (not shown here)
            File[] files = sSourceDir.listFiles(new Filter());
            for (int i = 0; i < files.length; i++) {
                File file = files[i];
                if (file.isFile()) {
                    System.out.println("Splitting file: " + files[i]);
                    splitFile(file, sDestinationDir);
                } else {
                    System.out.println("Not a file: " + files[i]);
                }
            }
        }

        public static File splitFile(File fSourceFile, File sDestinationDir) {
            int counter = 1;
            File fDestinationFile = new File(sDestinationDir, "NEW_" + counter + "_" + fSourceFile.getName());
            fDestinationFile.delete();
            String sLineOfData = null;
            boolean firstfile = true;
            try {
                BufferedReader DataFileReader = new BufferedReader(new FileReader(fSourceFile));
                PrintWriter outputStream = new PrintWriter(new FileWriter(fDestinationFile));
                while ((sLineOfData = DataFileReader.readLine()) != null) {
                    System.out.println(sLineOfData);
                    if (sLineOfData.indexOf("UNA:+") != -1) {
                        if (!firstfile) {
                            // delimiter found: close the current file and start the next one
                            counter++;
                            fDestinationFile = new File(sDestinationDir, "NEW_" + counter + "_" + fSourceFile.getName());
                            outputStream.close();
                            outputStream = new PrintWriter(new FileWriter(fDestinationFile));
                            outputStream.println(sLineOfData);
                        } else {
                            firstfile = false;
                            outputStream.println(sLineOfData);
                        }
                    } else {
                        outputStream.println(sLineOfData);
                    }
                }
                outputStream.close();
                DataFileReader.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
            return fSourceFile;
        }
    }
    And this is an example of a file:
    PS: I have shortened each long line (the original lines are about 4000 characters).
    UNA:+,? '
    UNB+UNOC:3d+5499757493404:14+3014331700208:14+050114:1200+ACC302++STS.GZ++1++1'
    UNH+I15185477+UTILTS:D:03B:UN:E5BE03'BGM+E32::260+I15185477+9+NA'
    DTM+137:200501141151:203'DTM+735:?+0100:406'MKS+23'NAD+MR+3014331700208::9'
    UNA:+,? '
    UNB+UNOC:3+549975f7493404:14+3014331700208:14+050114:1200+ACC302++STS.GZ++1++1'
    UNH+I15185477+UTILTS:D:03B:UN:E5BE03'BGM+E32::260+I15185477+9+NA'
    DTM+137:200501141151:203'DTM+735:?+0100:406'MKS+23'NAD+MR+3014331700208::9'
    DTM+137:200501141151:203'DTM+735:?+0100:406'MKS+23'NAD+MR+3014331700208::8'
    UNA:+,? '
    UNB+UNOC:3g+5499757g493404:14+3014331700208:14+050114:1200+ACC302++STS.GZ++1++1'
    UNH+I15185477+UTILTS:D:03B:UN:E5BE03'BGM+E32::260+I15185477+9+NA'
    DTM+137:200501141151:203'DTM+735:?+0100:406'MKS+23'NAD+MR+3014331700208::9'

  • How to update a managed metadata column for all files in a document library using PowerShell

    Hi,
    How do I update a managed metadata column for all files in a document library using PowerShell?
    Any help on this would be appreciated.
    Thanks & REgards
    Poomani Sankaran

    Hi TanPart,
    I have changed the code you gave in order to get the files from a SharePoint 2010 Foundation document library, but I am getting the error below in PowerShell:
    Property 'ListItemCollectionPosition' cannot be found on this object; make sure it exists and is settable.
    Could you tell me what the issue is?
    See the code below.
    $web = Get-SPWeb http://ntmoss2010:9090/Site
    $list = $web.Lists["DocLib"]
    $query = New-Object Microsoft.SharePoint.SPQuery
    $query.ViewAttributes = "Scope='Recursive'"
    $query.RowLimit = 2000
    $caml = '<Where><Contains><FieldRef Name="Title" /><Value Type="Text">Process Documents/Delivery</Value></Contains></Where>' +
            '<OrderBy Override="TRUE"><FieldRef Name="ID"/></OrderBy>'
    $query.Query = $caml
    do
    {
        $listItems = $list.GetItems($query)
        # page with the same $query object; the original used an undefined $spQuery, which caused the error above
        $query.ListItemCollectionPosition = $listItems.ListItemCollectionPosition
        foreach ($item in $listItems)
        {
            # Cast to SPListItem to avoid ambiguous overload error
            $spItem = [Microsoft.SharePoint.SPListItem]$item
            Write-Host $spItem.Title
        }
    }
    while ($query.ListItemCollectionPosition -ne $null)
    Thanks & Regards
    Poomani Sankaran
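    For the actual metadata update the thread title asks about, a rough sketch along these lines might work once the items are being retrieved. The column name "Department", the term group and term set names, and the term label are hypothetical, and note that managed metadata needs the Managed Metadata Service, which SharePoint Foundation does not include.
    # Rough sketch: stamp a managed metadata column on every document in the library
    # "Department", "Corporate", "Departments" and "Delivery" are hypothetical names
    $web  = Get-SPWeb http://ntmoss2010:9090/Site
    $list = $web.Lists["DocLib"]
    $session   = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($web.Site)
    $termStore = $session.TermStores[0]
    $termSet   = $termStore.Groups["Corporate"].TermSets["Departments"]
    $term      = $termSet.GetTerms("Delivery", $true)[0]
    $field = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$list.Fields["Department"]
    foreach ($item in $list.Items) {
        $field.SetFieldValue($item, $term)   # write the term into the taxonomy field
        $item.SystemUpdate()                 # persist without changing Modified/Modified By
    }
    $web.Dispose()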

  • CS3: Saving or Outputting .AI File as Multiple Files

    There must be a way to save or output a single Adobe Illustrator file into multiple files based on the "pages" I have created in the document?
    For example, if I have tiled 10 tabloid-sized pages onto a single Illustrator CS3 .AI file, is it possible to save the file in a manner that allows all of those tiled pages to be saved as separate, individual files?
    I guess it would be a similar concept to how newer versions (like CS4) can export multiple JPEGs of an Illustrator file that contains several artboards, automatically assigning a number suffix to the filename since multiple files were output from the single .ai file.
    Any help is MUCH appreciated. Thanks!

  • Pull a zip file with multiple files, unzip it, and finally load one of the files

    Hi All,
    I have the following query.
    Could we do the following with XI: pull a zip file containing multiple files from a vendor which resides outside of the XI server network, unzip it, and finally load one of the files to SAP?
    Regards
    Rohan S

    Hi Varadharajan,
    I have 10 text files in a ZIP file on a server which is outside of the network. We can reach that file only through a proxy to read the ZIP file.
    I need to extract the files and, based on some condition, upload the data from one of the files to SAP.
    Is it possible? If so, how?
    Regards

  • Set storage quota on multiple mailboxes using PowerShell?

    I need to set storage quota limits on multiple mailboxes using PowerShell. I understand I can create a .csv file with aliases and pipe that into a cmdlet, e.g.,
    Import-CSV "C:\temp\alias.csv" | % {Set-Mailbox -identity $_.alias -IssueWarningQuota 900mb -ProhibitSendQuota 950mb -ProhibitSendReceiveQuota 1gb -UseDatabaseQuotaDefaults $false}
    Is there any other way of doing this with a much more robust script?
    Any help would be much appreciated.

A: Set storage quota on multiple mailboxes using PowerShell?

Hi,
Is there any special attribute for these multiple mailboxes, such as being in a specific OU or a distribution group?
If there is, we can use a filter directly to pick out these mailboxes instead of creating a .csv file for them. The following example sets storage quotas for mailboxes in a distribution group, Group1:
Get-DistributionGroupMember -Identity Group1 | ForEach{ Set-Mailbox -identity $_.Name -IssueWarningQuota 900mb -ProhibitSendQuota 950mb -ProhibitSendReceiveQuota 1gb -UseDatabaseQuotaDefaults $false}
The following example sets storage quotas for mailboxes from the Exchange department:
Get-Recipient | Where-Object {$_.Department -eq 'Exchange'} | ForEach{ Set-Mailbox -identity $_.Name -IssueWarningQuota 900mb -ProhibitSendQuota 950mb -ProhibitSendReceiveQuota 1gb -UseDatabaseQuotaDefaults $false}
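If the mailboxes are grouped by an OU instead, something along these lines should also work (the OU path is only an example):
Get-Mailbox -OrganizationalUnit "contoso.com/Sales" -ResultSize Unlimited | Set-Mailbox -IssueWarningQuota 900mb -ProhibitSendQuota 950mb -ProhibitSendReceiveQuota 1gb -UseDatabaseQuotaDefaults $false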
Regards,
Winnie Liang
TechNet Community Support

  • AppleScript Split text file into multiple files

    I have large text files that I would like to break up into individual files. They are compilations of articles; I want to create a new file for each article, which begins with:
         Document # of #
    I don't think any other lines start with "Document" so that might be fine as a delimiter.
    Also, it would be great if the new files could be named "LAT_1", "LAT_2", etc.
    I found this thread:
    AppleScript separating one text file into many at Paragraph break
    I tried the second script by twtwtw and changed
    set parsingText to "Document"
    but got the following error:
    error "Can’t get item 2 of {\"LAT 1985-1990 local food copy.txt\"}." number -1728 from item 2 of {"LAT 1985-1990 local food copy.txt"}
    I also found this thread but it was too complicated for me to understand. Re: Split a large text file with many entries into separate files
    Can anyone help me modify the script? I've been looking at this as well as Unix commands to use in Terminal, but I'm just too much of a beginner.
    Any assistance is much appreciated!
    Trisha

    Hello
    Here's a modified version of the script in
    Split a large text file with many entries into separate files
    https://discussions.apple.com/thread/5641125?tstart=0
    This will split the given text at delimiter lines of the form "Document # of #" and save each chunk of text, from one delimiter to the line preceding the next delimiter, in a separate file named LAT_0000.txt, where 0000 is a sequential number starting at 0001.
    Note that any text before the first delimiter is ignored and not saved to an output file. You may select multiple input files. Existing files in the destination folder are overwritten. Input text is assumed to be in UTF-8 and each line is terminated by U+000A LINE FEED.
    set ff to choose file with prompt "Choose input file(s)." with multiple selections allowed
    set d to choose folder with prompt "Choose destination folder."
    set args to ""
    repeat with a in {d} & ff
        set args to args & " " & a's POSIX path's quoted form
    end repeat
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & args & "
    use strict;
    sub usage () {
        printf STDERR \"Usage: %s output_directory file [file ...]\\n\", $0;
        exit 1;
    }
    &usage() unless @ARGV > 1;
    my $outdir = shift @ARGV;
    unless ( -d $outdir ) {
        printf STDERR \"Not a directory: %s\\n\", $outdir;
        &usage();
    }
    my $i = 0;
    my ($n, $n0, $t);
    while (<ARGV>) {
        if ( /^ \\s* Document \\s+ [0-9]+ \\s+ of \\s+ [0-9]+ .* $/ox ) {
            # start a new output chunk at every 'Document # of #' line
            $n = sprintf('LAT_%04d.txt', ++$i);             # e.g., LAT_0001.txt
            if ( $n0 ) {
                open(OUT, '>', \"$outdir/$n0\") or die $!;  # overwrite
                print OUT $t;
                close OUT;
            }
            ($n0, $t) = ($n, $_);
        }
        else {
            $t .= $_;
        }
    }
    # flush the final chunk
    if ( $n0 ) {
        open(OUT, '>', \"$outdir/$n0\") or die $!;          # overwrite
        print OUT $t;
        close OUT;
    }
    EOF"
    Hope this may help,
    H

  • How to split Large mp4 and AVI video files to smaller scenes

    Hi All
    I’ve been looking into how to cut up large files ready for import into CS4. So far all I can find are the usual suspects that only let you cut a section from a file. Does anyone use any software that enables you to split a large video file (MP4, AVI and so on) into, say, 20 sections all in one hit?
    I do a lot of HD onboard cams, so the video is set to fire and only shut off when the run has finished. I then end up with about 60 percent of the file (in different parts) that I need to trash.
    Any help or advice would be much appreciated indeed!
    Xray

    the_wine_snob wrote:
    Maybe, but maybe not. I use DigitalMedia Converter to convert to DV-AVI Type II's (I'm only doing SD), and it has a Split function. However, I have never used that, so do not know how well it might work for your needs, if at all. I just do not know. I believe that Deskshare has a user forum, and that might be a good place to try, after you've looked down their FAQ's.
    I hope that others will have a definitive answer for you, with iron-clad suggestions.
    Good luck,
    Hunt
    i don't know.

  • C# Split xml file into multiple files

    Below I have an XML file. I need to split this XML file into multiple XML files based on the date column value.
    Suppose I have 10 records with 3 different dates; then all records with the same date should go into one file. For example, here I have a file with three dates, so my output should be 3 files, each containing all the records for one date. I don't have any idea how
    to proceed on this, which is why I am not posting any code. This is needed urgently, please.
    <XML>
    <rootNode>
    <childnode>
    <date>2012-12-01</date>
    <name>SSS</name>
    </childnode>
    <childnode>
    <date>2012-12-01</date>
    <name>SSS</name>
    </childnode>
    <childnode>
    <date>2012-12-02</date>
    <name>SSS</name>
    </childnode>
    <childnode>
    <date>2012-12-03</date>
    <name>SSS</name>
    </childnode>
    </rootNode>
    </XML>

    Here is the full code:
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Xml.Linq;

    class curEntity
    {
        public DateTime Date;
        public string Name;

        public curEntity(DateTime _Date, string _Name)
        {
            Date = _Date;
            Name = _Name;
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            string InfilePath = @"C:\temp\1.xml";
            string OutFilePath = @"C:\temp\1_";

            XDocument xmlDoc = XDocument.Load(InfilePath);

            // Flatten every <childnode> into a (date, name) pair
            List<curEntity> lst = xmlDoc.Element("XML").Element("rootNode").Elements("childnode")
                .Select(element => new curEntity(Convert.ToDateTime(element.Element("date").Value),
                                                 element.Element("name").Value))
                .ToList();

            // One output file per distinct date
            var unique = lst.GroupBy(i => i.Date).Select(i => i.Key);
            foreach (DateTime dt in unique)
            {
                List<curEntity> CurEntities = lst.FindAll(x => x.Date == dt);
                XElement outXML = new XElement("XML", new XElement("rootNode"));
                foreach (curEntity ce in CurEntities)
                {
                    outXML.Element("rootNode").Add(new XElement("childnode",
                        new XElement("date", ce.Date.ToString("yyyy-MM-dd")),
                        new XElement("name", ce.Name)));
                }
                outXML.Save(OutFilePath + dt.ToString("yyyy-MM-dd") + ".xml");
            }

            Console.WriteLine("Done");
            Console.ReadKey();
        }
    }

  • Delete multiple files from multiple locations using config file

    Hello All,
    I am fairly new to .NET and I have to make a Windows application to delete files older than 90 days from multiple locations. These multiple locations can't be hard-coded and have to be configurable; the number of days after which to delete the files also has to be
    configurable.
    I think the answer to this would be an app.config file, but I don't know how to use it.
    Can anyone tell me about this and, if possible, help me out with a code snippet?

    Add settings like in the picture below (DeleteDays as int, Folders as StringCollection; add the strings by pressing the "..." button).
    Then use the code below:
    using System;
    using System.Collections.Specialized;
    using System.IO;
    using System.Linq;

    static void Main(string[] args)
    {
        int deleteDays = Properties.Settings.Default.DeleteDays;        // e.g. 90
        StringCollection folders = Properties.Settings.Default.Folders;
        foreach (string folder in folders)
        {
            DirectoryInfo info = new DirectoryInfo(folder);
            FileSystemInfo[] deleteFiles = info.EnumerateFiles()
                .Where(x => x.LastWriteTime < DateTime.Today.AddDays(-deleteDays))  // note the minus sign: older than deleteDays
                .ToArray();
            foreach (FileSystemInfo file in deleteFiles)
                File.Delete(file.FullName);
        }
    }
    jdweng

  • How to read multiple files at multiple locations using properties file

    Hi all,
    In my code I take configuration input from a properties file. It was really easy, but what if there are multiple files? How can I take configuration input (file path, etc.) from the properties file in that case? Any suggestions?

    Thanks :) ... I am doing it like this:
    String fName[] = new String[10];
    String choice[] = new String[10];
    String Delim[] = new String[10];
    ResourceBundle a = ResourceBundle.getBundle("input");
    String n = a.getString("NOF");   // number of files listed in input.properties
    try {
        for (int i = 0; i < Integer.parseInt(n); i++) {
            fName[i]  = a.getString("PATH" + (i + 1));
            choice[i] = a.getString("HEADER" + (i + 1));
            Delim[i]  = a.getString("DELIMITER" + (i + 1));
            putData(fName[i], choice[i], Delim[i]);   // put the data in the database
        }
    } catch (NumberFormatException e) {
        System.out.println("Number of files is invalid");
    }
    Hope this method will be fine. Again, thanks for the advice :) :)
