BIP-Weblogic out file growing in size

Hi,
I am noticing that BIP writes its stdout to the WebLogic node .out file. This includes the queries executed by reports, which causes the file size to grow. I am looking for ways to control what BIP writes to the .out file. Any thoughts?

I'm guessing that your system is running OpenSolaris, not Solaris. You should direct your question to [email protected]. Better yet, search the archives on opensolaris.org, where you might already find the answer. Sorry, I don't know the solution offhand, other than that it has to do with storing the OpenSolaris pkg bits after downloading for offline installation.
-- Alan

Similar Messages

  • What's wrong with the weblogic.log file?

    We currently have a problem and don't know what it means:
    our weblogic.log file grows very fast, generating a huge number of entries like:
    ####<Feb 18, 2003 10:29:13 AM CST> <Info> <DGCserver> <awhq7232.whq.ual.com> <ECSserver>
    <ExecuteThread: '13' for queue: 'default'> <> <> <000000> <Tried to renew lease
    on lost reference: '1160'>
    I counted and found that weblogic.log adds at least 50 such entries per minute,
    and its size increases by about 20 MB per day. But the application still works normally.
    Could you experts help us with the following questions:
    - Why does the WebLogic container generate so many "Info"-level entries?
    - Can we configure WebLogic Server so that such "Info"-level entries are not
    written to the weblogic.log file?
    Thank you very much!
    Bill Yuan

    It is harmless; it means that by the time the server got around to processing the lease
    renewal message, the referenced object was already gone.
    You may want to check with support; I think there was a fix that suppressed
    this message.
    --
    Dimitri

  • How do I find out a file's particular size?

    Good Evening,
    I am looking for assistance with finding a particular file's size within a group of files.
    If I have one large directory and want to find the smallest file in it, how would I go about doing that?
    Also, how would I find out when a file was created on the system or on a particular hard drive, such as a network-attached storage device?
    Thank you!

    While Command-I (or the Finder's Get Info window) will give you a file's size, that requires a one-by-one approach. Perhaps quicker would be a utility like "What Size" that lists folders/files and gives the size of each.
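
    If you are comfortable in Terminal, the shell can answer both questions for a whole directory at once. A minimal sketch for Mac OS X (the path is a placeholder; on some network volumes the creation/birth date may not be reported):

    cd /path/to/that/large/directory     # placeholder path
    ls -lS | tail -5                     # sorted by size, so the smallest files appear at the bottom
    stat -f "created: %SB  %N" *         # per-file creation (birth) date on Mac OS X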

  • How to rotate server out file Weblogic 10g

    Hi,
    We are using WebLogic 10 on Sun Solaris (Unix). We frequently see the server .out file fill up quickly, and we have to stop the servers, remove the .out file, and start them again. Is there any way to make WebLogic take care of this itself, i.e. rotate the .out file the way it rotates the log files?
    Cheers,

    Thanks, here is what I have in the /bin folder
    Directory of C:\Oracle\Middleware\wlserver_10.3\common\bin
    03/09/2012 09:13 AM <DIR> .
    03/09/2012 09:13 AM <DIR> ..
    03/31/2010 03:46 PM 6,877 commEnv.cmd
    03/31/2010 03:46 PM 18,391 commEnv.sh
    04/14/2010 06:37 AM <DIR> config
    03/31/2010 03:46 PM 2,082 config.cmd
    03/31/2010 03:44 PM 67,584 config.exe
    03/31/2010 03:46 PM 2,382 config.sh
    03/31/2010 03:46 PM 1,493 config_builder.cmd
    03/31/2010 03:46 PM 2,185 config_builder.sh
    03/31/2010 03:39 PM 45,056 console.exe
    03/31/2010 03:39 PM 53,760 consolew.exe
    12/07/2006 06:19 AM 20,480 DfRegistryWin32.dll
    04/14/2010 06:37 AM 472 fileRealm.properties
    03/31/2010 03:47 PM 133 nm_data.properties
    03/09/2012 08:57 AM 133 nm_data.properties.old
    05/11/2011 10:08 AM 1,580 nm_service_classpath.txt
    07/09/2010 10:38 AM 156 nodemanager.domains
    03/09/2012 08:57 AM 0 nodemanager.log.lck
    03/09/2012 09:09 AM 1,902 nodemanager.log.old
    05/05/2010 08:55 AM 812 nodemanager.properties
    03/09/2012 08:57 AM 756 nodemanager.properties.old
    03/31/2010 03:46 PM 1,646 pack.cmd
    03/31/2010 03:46 PM 2,780 pack.sh
    04/14/2010 06:37 AM <DIR> security
    04/14/2010 07:14 AM <DIR> servers
    03/31/2010 03:46 PM 1,882 setPatchEnv.cmd
    03/31/2010 03:46 PM 2,176 setPatchEnv.sh
    03/31/2010 03:46 PM 3,889 startManagedWebLogic.cmd
    03/31/2010 03:46 PM 4,155 startManagedWebLogic.sh
    03/31/2010 03:46 PM 6,717 startPointBase.cmd
    03/31/2010 03:46 PM 5,722 startPointBase.sh
    03/31/2010 03:46 PM 3,170 startPointBaseConsole.cmd
    03/31/2010 03:46 PM 2,739 startPointBaseConsole.sh
    03/31/2010 03:46 PM 2,284 stopPointBase.cmd
    03/31/2010 03:46 PM 1,872 stopPointBase.sh
    04/14/2010 07:18 AM <DIR> tmp
    03/31/2010 03:46 PM 1,760 unpack.cmd
    03/31/2010 03:46 PM 2,210 unpack.sh
    03/31/2010 03:46 PM 3,272 upgrade.cmd
    03/31/2010 03:46 PM 3,382 upgrade.sh
    03/31/2010 03:46 PM 30,739 wlscontrol.sh
    03/31/2010 03:39 PM 773 wlsifconfig.cmd
    03/31/2010 03:39 PM 13,838 wlsifconfig.sh
    03/31/2010 03:46 PM 496 wlst.cmd
    03/31/2010 03:46 PM 680 wlst.sh
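
    For what it's worth, since WebLogic's built-in rotation covers the .log files but (as noted in the question) not the .out file, a workaround often used on Unix is to rotate the .out file externally from cron with a copy-and-truncate step, so the running server keeps its open file descriptor and no restart is needed. A minimal sketch with illustrative paths:

    #!/bin/sh
    # Copy-and-truncate rotation for a managed server .out file (run from cron).
    OUT=/path/to/domain/servers/ManagedServer1/logs/ManagedServer1.out
    STAMP=`date +%Y%m%d%H%M`
    cp -p "$OUT" "$OUT.$STAMP" && cp /dev/null "$OUT"
    # prune rotated copies older than 14 days
    find /path/to/domain/servers/ManagedServer1/logs -name 'ManagedServer1.out.*' -mtime +14 -exec rm {} \;

    There is a small window between the copy and the truncate in which freshly written lines can be lost, which is usually acceptable for stdout noise.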

  • How to size a Scale-out File Server

    Hi,
    We are looking to implement a 2-node 2012 R2 Scale-out File Server cluster (using SAS JBOD enclosure) for the primary purpose of storing the VHD files that will be accessed by a 4-node 2012 R2 Hyper-V cluster using 10 gigabit Ethernet (no RDMA).  Our
    environment can be characterised as having a large number of mostly idle VMs that experience sporadic, low intensity use (this is *not* a VDI environment).  We have 2 questions.
    1) To what extent is RAM a consideration for the SoFS servers?  We can't find any documentation to suggest that there are benefits to be gained by having more RAM in the SoFS servers but we don't know if we should go with 8/16/32/64+ GB RAM in each
    of the nodes.
    2) With the need to keep costs down, we don't think RDMA / SMB-Direct NICs are going to be within our reach.  Should we however look to have 2 * dual-port 10 Gbps NICs in both the SoFS & Hyper-V boxes?

    Unless your VMs are read-intensive and you're going to deploy a CSV cache, the memory requirement for serving mostly idle VMs can be pretty low. However, RAM is cheap these days, so going with less than 16 GB per node does not sound reasonable. For a good example, see:
    Windows Server 2012 File Server Tip: Enable CSV Caching on Scale-Out File Server Clusters
    http://blogs.technet.com/b/josebda/archive/2012/11/14/windows-server-2012-file-server-tip-enable-csv-caching-on-scale-out-file-server-clusters.aspx
    How to Enable CSV Cache
    http://blogs.msdn.com/b/clustering/archive/2013/07/19/10286676.aspx
    HYPER-V OVER SMB: SCALE-OUT FILE SERVER AND STORAGE SPACES
    http://www.thomasmaurer.ch/2013/08/hyper-v-over-smb-scale-out-file-server-and-storage-spaces/
    Hope this helped :)
    StarWind VSAN [Virtual SAN] clusters Hyper-V without SAS, Fibre Channel, SMB 3.0 or iSCSI, uses Ethernet to mirror internally mounted SATA disks between hosts.

  • Weblogic 8.1 Server log size increase in Production environment

    Hi,
    Issue: One of the log files keeps growing past the size specified in the configuration file, resulting in an application outage.
    Issue description:
    We are having problems with the log size on the WebLogic 8.1 server. FileMinSize is set in config.xml.
    New log files such as MYsvr.log00001, MYsvr.log00002, MYsvr.log00003, MYsvr.log00004, etc. are being generated correctly when the maximum file size is reached. But at the same time, one of the files keeps growing beyond the limit set in the configuration file, e.g. MYsvr.log00001 is 800 MB while the other files (MYsvr.log00002, MYsvr.log00003, etc.) are 10 MB.
    This increase in size of the log has been resulting in an application outage.
    More Details:
    1. Server: BEA Weblogic 8.1 server
    2. Log size is fine in other environments. This is a problem only in the production environment.
    3. The entry in the config.xml is as follows:
    <Server ListenPort="6313" Name="MYsvr" NativeIOEnabled="true" TransactionLogFilePrefix="./">
    <ServerStart Name="MYsvr"/>
    <Log FileMinSize="10000" FileName="MYsvr.log" Name="MYsvr"
    NumberOfFilesLimited="true" RotationType="bySize"/>
    <SSL Name="MYsvr"/>
    <ServerDebug Name="MYsvr"/>
    <WebServer Name="MYsvr"/>
    <ExecuteQueue Name="default" ThreadCount="15"/>
    <KernelDebug Name="MYsvr"/>
    </Server>
    Could you please help with this issue ?
    Thank you.

    Can someone please provide a solution for this issue?
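
    Not an answer to the rotation problem itself, but one way to narrow it down is to check whether more than one process is holding the oversized file open; a second writer (for example a duplicate server instance started against the same directory) will keep appending to a file that has already been rotated. A hedged diagnostic sketch, with illustrative paths:

    cd /path/to/MYsvr/domain            # illustrative domain directory
    ls -l MYsvr.log*                    # confirm which rotated file is oversized
    fuser MYsvr.log00001                # list the PIDs that have the file open (Solaris/Linux)
    ps -fp <pid_from_fuser>             # identify the process(es) still writing to it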

  • 903/Enterprise; log files grow boundlessly and weekly bounces are needed

    This feedback is the result of an evaluation of 903 IAS Enterprise on Solaris, in a 2-box cluster.
    This feedback is FYI.
    After installing IAS Enterprise 903 many times and testing stop/start/reboot use cases, I noticed that many internal log files grow boundlessly.
    I've seen the notes on turning down log levels. This is still far from adequate. Log files in a 24x365 architecture must be manageable and must have settable, finite limits on maximum disk-space consumption.
    Framework error events also need to be hooked into an enterprise error and alert protocol. SNMP is not my favorite, but it's a start; JMX is my favorite. Also check out the free Big Brother monitor (www.bb4.com) and its wire protocol.
    Thanks and best of luck,
    Curt

    Just curious whether other folks have any opinions on
    OEM's suitability for a high rel. environment?
    On log file management?
    curt

  • How to determine binary file data set size

    Hi all
    I am writing specific sets of array data to a binary file, appending each time so the file grows by one data set for each write operation.  I use the set file position function to make sure that I am at the end of the file each time.
    When I read the file, I want to read only the last 25 (or some number) data sets.  To do this, I figured on using the set file position function to place the file position to where it was 25 data sets from the end.  Easy math, right ?  Apparently not.
    Well, as I have been collecting file size data during the initial test run, I am finding (using the file size command, which returns the number of bytes) that the size is not growing by the same amount every time.  The size and format of the data being written is identical each time: an array of four double-precision numbers.
    The increments I get are as follows, after first write - 44 bytes, after 2nd - 52 bytes, 3rd - 52 bytes, 4th 44 bytes, 5th - 52 bytes, 6th - 52 bytes, 7th - 44 bytes and it appears to maintain this pattern going forward.
    Why would each write operation not add an identical number of bytes?  This means that my basic math for determining the correct file position to read only the last 25 data sets will not be simple, and what if, somewhere along the line after I have accumulated hundreds or thousands of data sets, the pattern changes?
    Any help on why this is occurring, or on a method of working around the problem, would be much appreciated.
    Thanks
    Doug
    "My only wish is that I am capable of learning each and every day until my last breath."

    I have stripped out the DSC module functions from the vi and attached it here.  I also set default values for all the inputs so it will run with no other inputs.  I also included my current data files (zipped, as I have four of them), though the file names are hard-coded in the vi, so they can be changed to whatever works locally; in fact they will probably have to be modified for the path anyway.
    If you point to a path that has no file, it will create a new one on the first run and the file size will show zero since there is no data in it. It will start to show the changes on each subsequent run.
    As I am creating and appending four different files, each with its own set of data but always the same format (an array of four double-precision numbers), and the file size information always increments the same way for all four files (as can be seen in the File Size Array), I don't think it is a function of the size of the actual numbers but rather some idiosyncrasy in how the binary file is created.
    If this proves to be a major hurdle I guess I could try a TDM file but I already got everything else working with this one and need to move on to other tasks.
    Thanks for any continued assistance
    Doug
    "My only wish is that I am capable of learning each and every day until my last breath."
    Attachments:
    !_Data_Analysis_Simple.vi (40 KB)
    SPC.zip (2 KB)

  • Two machines saving the same file as different sizes?

    My coworker and I both have the same version of Illustrator (CS6) and both use Lion on iMacs, but today we noticed something weird. He saved a file, similar to one I had done before, as both an EPS and a PDF, and his file sizes were more than twice what mine usually are. I thought it was odd, so I copied everything from his file into a new file (same dimensions) and saved out an EPS and a PDF (default settings), and as I expected, my files were less than half the size of his.
    Why would two machines be saving identical files at different sizes? Is there a setting somewhere I'm missing? Everything in the file is vector, if it matters. There's not even any editable text.

    See the mechanism of saving here: http://superuser.com/questions/66825/what-is-the-difference-between-size-and-size-on-disk
    The size of the "blocks" depends on the size of the disk and how it's been formatted.
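
    The same distinction between a file's logical size and the space it occupies on disk is easy to see from the command line; a small illustrative check on a Mac (the file name is a placeholder):

    ls -l artwork.eps     # logical file size in bytes
    du -k artwork.eps     # space actually allocated on disk, in 1 KB units
    stat -f "%N occupies %b 512-byte blocks on disk" artwork.eps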

  • How to add a border around a video (padding out to a larger size)?

    Hi,
    I'm having problems doing this in Quicktime Pro. I don't want to increase the size of the visible part of the video, just pad it out with black space. I created a gif file of the size I want and pasted it in, but rather than the video increasing to fit the pasted image, the opposite happened and the image was scaled down to fit the video. Any help gratefully appreciated.

    First issue is the GIF format (only 256 colors). Don't use that format.
    To add a "black" box (or any other color) to your existing QuickTime file you should first make your image file in a format (other than GIF) that also supports transparency.
    Depending on your image editing software, the PNG format is the best choice. But JPEG will also work if you use 32-bit color (no transparency layer).
    To add this image to your existing QuickTime file follow these steps:
    Create the image file (1 bit for just black and white colors) and size it to the dimensions desired. JPEG or PNG will work nearly the same but JPEG doesn't support alpha channels.
    So, let's say you have a 320x240 QuickTime file but want a "box" around it sized at 320x320 (square).
    Create a 320X320 all black image saved as PNG or JPEG.
    Open that file with QuickTime Pro, select all (Control-A) and copy (Control-C).
    Switch to your 320X240 QuickTime "video" file.
    Select all (Control-A) and move to the Edit menu. Pick "Add to Selection & Scale". This will add your image to the entire "video" portion.
    Control-J to open the Movie Properties window.
    Highlight (single click) on the "Video" track portion of your new composite video.
    Click the "Visual Settings" tab.
    Here is where you change the position of your video track in relation to your image track.
    The "Offset" is what you need to adjust (because your video has smaller dimensions than the image track).
    Default positioning is 0,0 (upper left). Add new values to the offset to set the position of the video track. It may take a few tries; keep changing the offset values until you get the size and placement you desire.
    Save As. Give the file a new name and save as "self-contained" to make a new file.
    These same steps can add "video backgrounds" to another "video" track.
    One of my pages to show a QuickTime example:
    http://homepage.mac.com/kkirkster/crosstown/index.html

  • Outlook 2010 - Data File Properties/Folder Size verses Windows Explorer pst file size

    I am running Outlook 2010 32bit Version 14.0.6129.5000 on a Windows PC running Windows 7 Professional.  All updates from MS are up to date.
    I have several pst files I open with Outlook 2010. The sizes of the files displayed in Outlook are very different from what is displayed in Windows Explorer.
    For example, for one of the pst files, called "business.pst", Outlook displays under "Data File Properties -> Folder Size" that the Total Size (including subfolders) is 764,354 KB, while Windows Explorer says
    the file size is 1,190,417 KB.
    For some reason MS Outlook 2010 is displaying the wrong folder size.  Any ideas why this is the case?
    Thanks,
    Pat

    An Outlook mailbox grows as you create and receive items. When you delete items, the size of the Outlook Data File (.pst or .ost) might not decrease in proportion to the data that you deleted until the file has been compacted.
    Normally, after you have deleted items from an Outlook Data File (.pst), the file is automatically compacted in the background when you're not using your computer and Outlook is running.
    As an exception, when the Outlook Data File (.pst) is somehow corrupt, the compaction might not finish correctly, so the size of the file might remain the same as before compaction.
    To solve this, try running
    scanpst to repair the Outlook Data File (.pst) first; after that, you can
    manually start the compact command.
    When finished, compare the file size again.
    Max Meng
    TechNet Community Support

  • How to set local file copy buffer size?

    Is there any sysctl parameter or any other mechanism to set or change file copy buffer sizes? I'm backing up a huge number of files to a local hard drive connected by firewire, and I'd like to play with file copy buffer sizes for the best performance. The machine used is a new macbook pro running OS-X 10.6.7. Any ideas?

    Here is a bash script that uses the dd command to copy a file.  Use the -b option to set the block size.
    example:
    /Users/mac/config/forumcopy.command -vb  4096  /Applications\ \(Mac\ OS\ 9\)/Civilization\ II/Civ\ II\ Map\ Editor  /Applications\ \(Mac\ OS\ 9\)/Civilization\ II/Civ\ II\ Map\ Editorv8
    I haven't tested this a lot. 
    Of course, I haven't figured out how best to post code.  Trying HTML mode. Using the <pre> tag.
    #!/bin/bash
    # macteracopy [ -b blocksize ] [-v ] [-V ] input_file output_file
    # Purpose of this script:
    # Copy a file with optional block size. Default size of 4096.
    # Notes:
    # chmod u+x macteracopy
    # You may have to restart the finder to notice a customized file icon.
    #   Copyright 2010 rccharles
    #   GNU General Public License
    #   This program is free software: you can redistribute it and/or modify
    #   it under the terms of the GNU General Public License as published by
    #   the Free Software Foundation,  version 3
    #   This program is distributed in the hope that it will be useful,
    #   but WITHOUT ANY WARRANTY; without even the implied warranty of
    #   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    #   GNU General Public License for more details.
    #   For a copy of the GNU General Public License see
    #   <http://www.gnu.org/licenses/>.
    # debug info
    export PS4='+(${BASH_SOURCE}:${LINENO}):'
    ## not in the tiger version of bash ${FUNCNAME[0]:+${FUNCNAME[0]}(): }'
    declare -x   verbose="No"   \
             veryVerbose="No"
    blockSize=4096
    # Check for command-line options used when calling the script
    if [ $# -gt 0 ] ; then
       while getopts "b:vV" Option ; do
          case "${Option}" in
      b  ) blockSize=$OPTARG ;;
      v  ) verbose="Yes" ;;
      V  ) veryVerbose="Yes" ;;
      \? ) echo 'usage: macteracopy [ -b blocksize ] [ -v ] [ -V ] input_file output_file' ;;
      *  ) echo "Unknown argument among arguments $* on command line."
           exit 6 ;;
          esac
       done
    fi
    # We're done with switches / options from the command line
    shift $(($OPTIND - 1))
    [ "${veryVerbose}" = "Yes" ] \
       && set -x  
    inputFile="${1}"
    outputFile="${2}"
    [ "${verbose}" = "Yes" ] \
       && echo \
       && echo "$0 script revised $(GetFileInfo -m $0)" \
       && echo
    [ ! -f "${inputFile}" ] && echo "File not found.  ${inputFile}" && exit 4
    [ -d "${inputFile}" ] \
       && echo "Directories are not supported, yet.  ${inputFile}" \
       && exit 4
    [ "${veryVerbose}" = "Yes" ] \
       && ulimit -a \
       && df \
       && echo
    if [ "${verbose}" = "Yes" ] ; then
       # verbose: copy the data fork and the resource fork, showing dd's statistics
       dd bs=${blockSize} if="${inputFile}" of="${outputFile}"
       dd bs=${blockSize} if="${inputFile}/rsrc" of="${outputFile}/rsrc"
    else
       # quiet: the same copies, with dd's status output discarded
       {
          dd bs=${blockSize} if="${inputFile}" of="${outputFile}"
          dd bs=${blockSize} if="${inputFile}/rsrc" of="${outputFile}/rsrc"
       } 2>/dev/null
    fi
    [ "${verbose}" = "Yes" ] \
       && echo \
       && ls -l "${inputFile}" \
       && ls -l "${inputFile}"/rsrc \
       && GetFileInfo  "${inputFile}" \
       && ls -l "${outputFile}" \
       && ls -l "${outputFile}"/rsrc \
       && GetFileInfo  "${outputFile}" \
       && echo
    SetFile -a $(GetFileInfo -a "${inputFile}") \
            -c $(GetFileInfo -c "${inputFile}" | sed 's/"//g') \
            -t $(GetFileInfo -t "${inputFile}" | sed 's/"//g') \
            "${outputFile}"
    [ "${verbose}" = "Yes" ] \
       && echo "after SetFile" \
       && GetFileInfo  "${outputFile}"
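
    For reference, a hypothetical invocation after saving the script as macteracopy and making it executable (the paths and the 1 MB block size are only examples):

    chmod u+x macteracopy
    time ./macteracopy -v -b 1048576 /path/to/source/bigfile /Volumes/FireWireBackup/bigfile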

  • Why won't Excel automatically read my CSV file created with the PowerShell Out-File?

    I wrote a PowerShell script that creates a CSV-formatted file.  The script simply creates a comma-delimited string for each entry and adds it to a collection.  Then the Out-File command writes it to a file.
    When you open it with Excel, each line is put into one cell.  If you import the file and specify "," as the delimiter, it imports just fine.  If the data is saved out again as a csv file, the file is about half the size and opens
    just fine with Excel.
    If you open the original file and the file created by Excel with notepad, they look the same.
    So the files are different in size, contents look the same, but Excel won't automatically open the original file.
    Any ideas of why this happens?  Also PowerShell and Excel are both running on Server 2012 R2.  Excel is a remote app.
    If I do an export-CSV, I get some kind of information about the object.
    Here is the script:
    # Note: this snippet assumes $groups (and the starting value of $header) are defined earlier in the script.
    foreach ($group in $groups)
    {
       $header += '"' + $group.name + '",'
    }
    $CSVdata = @()
    $CSVdata += $header
    # Create a user entry
    $users = Get-ADUser -filter * | select SamAccountname, Name | sort SamAccountName
    foreach ($user in $users)
    {
       $Groupmembership = Get-ADPrincipalGroupMembership -Identity $user.SamAccountName
       $userentry = '"' + $user.SamAccountName + '",'
       $user
       foreach ($group in $groups)
       {
           if ($Groupmembership.SamAccountName -contains $group.SamAccountName) { $userentry += '"X",' }
           else { $userentry += '"",' }
       }
       $CSVdata += $userentry
    }
    $CSVdata
    Out-File -InputObject $CSVdata -FilePath c:\batch\GroupMembership.csv

    Ok the script works exactly like I want it.  Thank you very much.
    I am trying to understand the script but I am unable to figure out what the line "$keys=$t.Keys|%{$_}"
    does.  I figured ".keys" was a method but my search for it comes up blank.  Do you have a reference you can point me to?  

  • Win 2008  WL 10.3.3 stdout appearing in .log and .out files

    Recently I noticed a ballooning [ServerName].out file in the logs directory. In the WebLogic management console I do have it configured to redirect stdout and stderr to WebLogic logging (the .log file). Both the .log and .out files contain the same stdout/stderr information. I would like to eliminate the .out file if possible (since WebLogic only rotates the .log), but cannot find where it is configured. The managed servers are NOT Windows services (no -log option).
    I did not find any logging parameters in JAVA_OPTIONS or parameters in the startManagedSvc.cmd file.
    Is this something needing to be corrected at the application level? (log4j)

    opie wrote:
    Is this something needing to be corrected at the application level? (log4j)
    Depends on what you are actually seeing in those files. Are you outputting log4j messages to a log file AND the console?
    Here is a snippet of the log4j configuration file that denotes writing to the console:
        <appender name="ConsoleAppender" class="org.apache.log4j.ConsoleAppender">
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern" value="%d{yyyy-MM-dd hh:mm:ss} %-5p [%t] - %C{1}.%M -> %m%n" />
            </layout>
          </appender>
        <root>
            <level value="ALL" />
            <appender-ref ref="ConsoleAppender" />
        </root>
    Edited by: ForumKid2 on Dec 29, 2010 11:36 AM

  • Out-file -encoding default

    When using Out-File to send output to a text file, in order not to get the BOM (Byte Order Mark) I have to use the form
    Out-File -Encoding default.   Why is the default not the default?

    The default for Out-File is Unicode.
    EDIT: See here for more details:
    http://technet.microsoft.com/en-us/library/hh849882.aspx
    Don't retire TechNet! -
    (Don't give up yet - 12,830+ strong and growing)
