Noob - Setting up the buffer

Hello,
I'm not sure of the extent of my question, but can anyone point me toward some sample files or tutorials that explain how to set up the buffer indicator? I presume this question could break off into multiple genres of buffering (i.e., progressive buffering, streaming, dynamic streaming, etc.). I'm not sure where to start, or whether most of this is already built into the OSMF framework. If someone could suggest a starting point, that'd be great! Thanks again!

Scratch my question about MediaPlayer.bytesLoaded / MediaPlayer.bytesTotal. I didn't realize I was using a streaming video when I needed a progressive video for those values to take proper effect. So, I understand how the progressive download progress works. However, I have my download indicator updating on CURRENT_TIME_CHANGE, which I'm sure is probably not the best listener to update the buffer on. What listener should I attach for the progressive download case?
As for streaming videos, I understand your comments about MediaPlayer.bufferTime / MediaPlayer.bufferLength, but again, what listener should be attached? Also, I don't grasp how to apply this particular bit of code to the UI. Any suggestions would be great! Thanks again, Brian.

Similar Messages

  • Lync 2013 Edge - Windows Standard 2012 - Set-CSCertificate gives me "The buffer supplied to a function was too small."

    Hello,
    I'm having some issues during the installation of our new Edge 2013 server, specifically when trying to assign the external certificate.
    We have a Lync 2010 deployment already, and this is a step in the migration to the new version.
    On the 2010 Edge server, we have a Geotrust SAN certificate currently which it has been running nicely with for the past couple of years since we installed it.
    However, after trying to assign the certificate to the Lync 2013 Edge server, it just keeps giving me "set-cscertificate : Command execution failed: The buffer supplied to a function was too small."
    If I request a certificate from our internal CA, it assigns fine and there's no problem. However, I've gone over all the Subject Alternative Names on the Geotrust cert and all of them are present, and the certificate was exported and imported with the private key, so that should not be the issue either. The common names are the same, and all the SANs are there, along with quite a few others (though I expect this should not present any problems).
    We didn't have the intermediate Geotrust CA in the "Intermediate Certification Authorities" list, so I've imported that along with a current CRL, but it still refuses to assign the certificate.
    Trying to find more details on the error message seems rather futile - more descriptive error messages would have been helpful, but I'm hoping someone here might be able to give me a hand in diagnosing the actual issue.
    Thanks in advance,
    Johan

    In our case we traced the problem to the version of the certificate template. We could not utilize a v3 template from our Enterprise CA. Once the CA administrator configured and granted us the permissions to a v2 certificate template we were able to successfully
    assign a certificate to Lync.
    The problem comes in regarding the cryptography provider of the certificate template. Certificates based on a v2 template utilize CryptoAPI (Cryptography API), and v3 templates utilize CNG (Cryptography API: Next Generation) as the cryptography provider.
    It appears that Lync Server 2010 and 2013 do not utilize v3 certificates properly. This article explains how to determine which version of cryptography provider is being used for the certificates in your environment:   http://www.ehloworld.com/751.
    You may consider checking the template version of your certificate to see if that helps your situation, perhaps Geotrust can reissue you a v2 certificate if necessary.
    Further background info:  http://msdn.microsoft.com/en-us/library/windows/desktop/bb931355(v=vs.85).aspx
    Regards,
    Jason
    Jason Hindson

  • Setting the buffer size centrally

              Is there a way to set the buffer size to a specific value for
              multiple JSPs without having to use the @page buffer...
              directive on every page?
              I have a JSP/EJB application with many JSPs and want to allow
              for an easy update of the buffer size in the future.
              Placing <@page buffer... in a separate file and using the
              @include directive or <jsp:include... action doesn't work.
              

    I had the same problem, and that's a great solution. Anyone know how to set
              the autoFlush property in an included file?
              "Gary Keim" <[email protected]> wrote in message
              news:3a96c2d8$[email protected]..
              > % cat Test.jsp
              > <%@ include file="SetBuffer.jsp" %>
              >
              > <html>
              > <body>
              > BufferSize is <%=response.getBufferSize()%>
              > </body>
              > </html>
              > % cat SetBuffer.jsp
              > <%response.setBufferSize(16000);%>
              > --
              > Gary
              >
              > "Padraig O Broin" <[email protected]> wrote in message
              > news:3a969809$[email protected]..
              > >
              > > Is there a way to set the buffer size to a specific value for
              > > multiple JSPs without having to use the @page buffer...
              > > directive on every page?
              > >
              > > I have a JSP/EJB application with many JSPs and want to allow
              > > for an easy update of the buffer size in the future.
              > >
              > > Placing <@page buffer... in a separate file and using the
              > > @include directive or <jsp:include... action doesn't work.
              >
              >
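
    For what it's worth, the reason the included-file trick works can be sketched outside the servlet container: setBufferSize is only legal before any output has been written, which is why the include has to come first in the page. The ResponseBuffer class below is a hypothetical stand-in for JSP response buffering, not the servlet API itself.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Minimal sketch of JSP-style response buffering. Writes accumulate
    // until the buffer fills, then autoFlush sends them, mirroring
    // <%@ page buffer="..." autoFlush="true" %>.
    class ResponseBuffer {
        private final StringBuilder buffer = new StringBuilder();
        private final List<String> flushed = new ArrayList<>();
        private int bufferSize;

        ResponseBuffer(int bufferSize) { this.bufferSize = bufferSize; }

        // Like response.setBufferSize(): only legal before any output exists.
        void setBufferSize(int size) {
            if (buffer.length() > 0 || !flushed.isEmpty())
                throw new IllegalStateException("content already written");
            bufferSize = size;
        }

        void write(String s) {
            buffer.append(s);
            if (buffer.length() >= bufferSize) flush(); // autoFlush behaviour
        }

        void flush() {
            flushed.add(buffer.toString());
            buffer.setLength(0);
        }

        int flushCount() { return flushed.size(); }
    }

    public class BufferDemo {
        public static void main(String[] args) {
            ResponseBuffer r = new ResponseBuffer(8);
            r.setBufferSize(16);   // allowed: nothing written yet
            r.write("0123456789"); // under 16 chars: stays buffered
            System.out.println(r.flushCount()); // 0
            r.write("0123456789"); // crosses 16: auto-flushed
            System.out.println(r.flushCount()); // 1
        }
    }
    ```

    This also suggests why autoFlush is harder to set from an included file: like the buffer size, it must be fixed before any output is committed.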
              

  • How do I set up a 6534 buffer read to just return if the buffer is not full yet?

    I would like my 6534 buffer read and data analysis to run asynchronously with other data acquisition tasks. The problem is that it waits until the buffer is done before returning, and everything else is stopped. I have tried several techniques to program a continuous buffer read and monitor the state of the buffer immediately, so I can skip my data analysis unless the buffer is full, but nothing works.
    I have tried:
    Setting number of scans to acquire to zero and monitoring the scan backlog.
    Setting number of scans to acquire to zero and monitoring the acquisition state in mark locations.
    Setting number of scans to acquire to the previous scan backlog. This method requires a second circular buffer that I must manage.
    Of course, with all of these attempts, the DIO start was set to 0 number of scans to acquire to make it continuous.
    Is there an example of this?

    Although I haven't tried this for buffered digital channels, it works for analog input channels, so give it a shot.
    When using analog channels, before you call the AI Start VI, you can wire in a VI located in Data Acquisition -> Calibration & Configuration called DAQ Occurrence Config.vi. This VI will set an occurrence (like an interrupt) when the buffer reaches a certain size, and then you can perform your read of the buffer accordingly.
    The best way to see an example of this VI is to go to LabVIEW/examples/DAQ/anlogin/anlogin.llb/Cont Acq&Chart (Asynch Occurrence). The occurrence will allow the data acquisition thread to remain open, and allow the CPU to perform other tasks while waiting for the occurrence to be set. This sounds exactly like what you want.
    I just checked the examples/DAQ/digital directory, and there is an example with continuous digital input which uses the DAQ occurrence VI as well, so this should work for you.
    Mark
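
    The occurrence Mark describes is essentially a one-shot notification that fires when the buffer reaches a preset fill level, leaving the CPU free in the meantime. A rough analogue can be sketched in Java (purely illustrative - the real mechanism lives inside NI-DAQ and LabVIEW; the threshold and sample counts below are made up):

    ```java
    import java.util.concurrent.CountDownLatch;

    // An "occurrence" modeled as a latch: the acquisition thread counts it
    // down when the simulated buffer reaches the threshold; the main thread
    // blocks on await() instead of polling, then reads the buffer.
    public class OccurrenceSketch {
        public static void main(String[] args) throws InterruptedException {
            final int threshold = 100;
            final CountDownLatch occurrence = new CountDownLatch(1);
            final int[] buffer = new int[1000];
            final int[] count = {0};

            Thread acquisition = new Thread(() -> {
                for (int i = 0; i < 250; i++) {
                    buffer[count[0]++] = i;                 // simulated sample
                    if (count[0] == threshold) occurrence.countDown();
                }
            });
            acquisition.start();

            occurrence.await();  // free to do other work until this fires
            System.out.println("buffer reached " + threshold + " samples");
            acquisition.join();
        }
    }
    ```

    The point of the pattern, as in the LabVIEW example, is that waiting on the occurrence costs nothing, whereas polling the backlog in a tight loop would.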

  • Setting the buffer time on an flv

    My file displays an external FLV using the FLV component; no matter what I set the buffer to, it still plays straight away. I want a 2 or 3 second pause while the video is loading before the movie plays - any ideas?
    gryllsie

    double[] myArray = new double[10];
    for (int i = 0; i < 10; i++)
        myArray[i] = i % 4;
    NationalInstruments.UI.AnalogWaveformPlotOptions plotOptions = new NationalInstruments.UI.AnalogWaveformPlotOptions(
        NationalInstruments.UI.AnalogWaveformPlotDisplayMode.Time,
        NationalInstruments.UI.AnalogWaveformPlotScaleMode.Scaled,
        NationalInstruments.UI.AnalogWaveformPlotTimingMode.Auto);
    waveformGraph1.Plots[0].DefaultTiming = NationalInstruments.WaveformTiming.CreateWithRegularInterval(
        TimeSpan.FromSeconds(1),
        DateTime.Now,
        TimeSpan.FromSeconds(0));
    NationalInstruments.AnalogWaveform<double> myWaveform = NationalInstruments.AnalogWaveform<double>.FromArray1D(myArray);
    waveformGraph1.PlotWaveform<double>(myWaveform, plotOptions);
    Plots something that looks like:
    National Instruments
    Product Support Engineer

  • Logic 7.1 - Where's the buffer setting for DTDM/DAE?

    Seems that I cannot find the buffer to vary my Digi HD setup. I can see the buffer process manager at the bottom of the driver setup menu (small,medium,large) but I would love to find the usual 128-1024 options that are the usual standard offerings. Any ideas? Is it hidden somewhere else or handled by " Digi Coreaudio Manager"? I'd love to minimize some of the artifacts I'm experiencing on larger instantiated files ....

    Hello BassMan,
    I don't know about the Pro Tools HD setup and I may be writing nonsense; however, in older systems you could change the buffer size just by opening the DAE program directly. If your Digi rig runs on DAE, it may be worth a try.
    Good luck,
    Mac512 (old, B&W, no scsi nor adb, but still running well)

  • Linux Serial NI-VISA - Can the buffer size be changed from 4096?

    I am communicating with a serial device on Linux, using LV 7.0 and NI-VISA. About a year and a half ago I had asked customer support if it was possible to change the buffer size for serial communication. At that time I was using NI-VISA 3.0. In my program the VISA function for setting the buffer size would send back an error of 1073676424, and the buffer would always remain at 4096, no matter what value was input into the buffer size control. The answer to this problem was that the error code was just a warning, letting you know that you could not change the buffer size on a Linux machine, and 4096 bytes was the pre-set buffer size (unchangeable). According to the person who was helping me: "The reason that it doesn't work on those platforms (Linux, Solaris, Mac OSX) is that is it simply unavailable in the POSIX serial API that VISA uses on these operating systems."
    Now I have upgraded to NI-VISA 3.4 and I am asking the same question. I notice that an error code is no longer sent when I input different values for the buffer size. However, in my program, the bytes returned from the device max out at 4096, no matter what value I input into the buffer size control. So, has VISA changed, and it is now possible to change the buffer size, but I am setting it up wrong? Or, have the error codes changed, but it is still not possible to change the buffer size on a Linux machine with NI-VISA?
    Thanks,
    Sam

    The buffer size still can't be set, but it seems that we are no longer returning the warning. We'll see if we can get the warning back for the next version of VISA.
    Thanks,
    Josh

  • How do I know when the buffer flushed all the data out?

    I am using a very high sampling rate (500000 Hz) and acquire 1024 data points continuously. It takes 370000 data points in 10 seconds. I use a counter to help with the retrigger PFI line. I have a huge buffer so that I can make sure that the buffer does not overflow. The code is attached below. My problem is that the data acquisition is done very fast (in 10 seconds) but the processing of the data is not. In OnEvent, I basically save and plot the data. The saving process is not slow. However, our video card is so slow that it cannot keep up with real-time data display. After the user is done collecting the data, they do not want to wait for the screen to plot the data from the buffer. So after the data collection is done, I basically stop the plotting process, but we still need to flush the data out from the buffer for saving. My question is: how can I tell when the buffer is empty?
    Thanks,
    Yajai
    m_task = std::auto_ptr<CNiDAQmxTask>(new CNiDAQmxTask("aiTask"));
    m_counter = std::auto_ptr<CNiDAQmxTask>(new CNiDAQmxTask("coTask"));
    m_task->Stream.Timeout = -1;
    // Create a channel
    m_task->AIChannels.CreateVoltageChannel(physicalChannel, "",
        static_cast<DAQmxAITerminalConfiguration>(DAQmxAITerminalConfigurationRse), minimum, maximum,
        DAQmxAIVoltageUnitsVolts);
    m_task->Timing.ConfigureSampleClock(counterSource, sampleRate, DAQmxSampleClockActiveEdgeRising, DAQmxSampleQuantityModeContinuousSamples, samplesPerChannel);
    m_task->Stream.Buffer.InputBufferSize = samplesPerChannel * 2000;
    m_counter->COChannels.CreatePulseChannelFrequency(counterChannel, "coChannel", DAQmxCOPulseFrequencyUnitsHertz, DAQmxCOPulseIdleStateLow, 0, sampleRate, 0.5);
    m_counter->Timing.ConfigureImplicit(DAQmxSampleQuantityModeFiniteSamples, samplesPerChannel);
    m_task->Control(DAQmxTaskVerify);
    m_counter->Control(DAQmxTaskVerify);
    m_counter->Triggers.StartTrigger.ConfigureDigitalEdgeTrigger(
        referenceTriggerSource, DAQmxDigitalEdgeStartTriggerEdgeRising);
    m_counter->Triggers.StartTrigger.Retriggerable = true;
    m_taskRunning = true;
    m_counter->Start();
    // Set up the graph
    m_Graph.Plots.RemoveAll();
    for (unsigned int i = 0; i < m_task->AIChannels.Count; i++)
    {
        m_Graph.Plots.Add();
        m_Graph.Plots.Item(i + 1).LineColor = m_colors[i % 8];
    }
    // Create multi-channel reader
    m_reader = std::auto_ptr<CNiDAQmxAnalogMultiChannelReader>(new CNiDAQmxAnalogMultiChannelReader(m_task->Stream));
    m_reader->InstallEventHandler(*this, OnEvent);
    m_reader->ReadMultiSampleAsync(samplesPerChannel, m_data);

    Yajai,
    I'm a little confused about your acquisition. Do you intend for it to be finite, or continuous? I'm also unclear about your rates. You state that you are acquiring 1024 samples at 500 kHz, yet you get only 370k samples in 10 seconds. Are you periodically acquiring 1024 samples at 500 kHz? Do you do any reads other than the final m_reader->ReadMultiSampleAsync(samplesPerChannel, m_data)? Could you provide the code where you stop the plotting process?
    Thanks,
    Ryan V.
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments

  • Where can I change the buffer size for LKM File to Oracle (EXTRENAL TABLE)?

    Hi all,
    I have a problem with the buffer size of the "LKM File to Oracle (EXTRENAL TABLE)", as follows:
    2801 : 72000 : java.sql.SQLException: ORA-12801: error signaled in parallel query server P000
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04020: found record longer than buffer size supported, 524288, in D:\OraHome_1\oracledi\demo\file\PARTIAL_SHPT_FIXED_NHF.dat
    Do you know where can I change the buffer size?
    Remarks: The size of the file is ~2Mb.
    Tao

    Hi,
    The behavior is explained in Bug 4304609 .
    You will encounter ORA-29400 and KUP-04020 errors if the RECORDSIZE clause in the access parameters for the ORACLE_LOADER access driver is larger than 10MB and you are loading records larger than 10MB. This means there is another limitation on the read size of a record, termed the granule size. If the default granule size is less than RECORDSIZE, it limits the size of the read buffer to the granule size.
    Use the _px_xtgranule_size parameter to change the size of the granule to a number larger than the size specified for the read buffer. You can use the query below to determine the current size of the granule.
    SELECT KSPFTCTXPN PARAMETER_NUMBER,
           KSPPINM PARAMETER_NAME,
           KSPPITY PARAMETER_TYPE,
           KSPFTCTXVL PARAMETER_VALUE,
           KSPFTCTXDF IS_DEFAULT,
           KSPPIFLG MODIFICATION_FLAG,
           KSPFTCTXVF VALUE_FLAG
      FROM X$KSPPI X, X$KSPPCV2 Y
     WHERE (X.INDX+1) = KSPFTCTXPN
       AND KSPPINM LIKE '%_px_xtgranule_size%';
    There is no 'ideal' or recommended value for the _px_xtgranule_size parameter; it is safe to increase it to work around this particular problem. You can set this parameter using the ALTER SESSION/SYSTEM command.
    SQL> alter system set "_px_xtgranule_size"=10000;
    Thanks,
    Sutirtha

  • VISA Set I/O Buffer Size fails with all but one value on Linux RT

    I was unable to initialize a serial port on a cRIO-9030 using code that works fine on VxWorks and Windows, and I tracked it down to this somewhat strange behaviour:
    If you call VISA Set I/O Buffer Size on Linux RT (at least on the 9030 device), you will get error code 1073676424 for all size values other than 0.
    That is a bit strange (what will the buffer size be then, I might add...), but something even uglier is that if you leave the function's buffer size input unwired, you will also get the error (because the function's default is 4096).
    MTO

    Under the hood, VISA is using the POSIX serial interface (the same on Linux, Mac OS X, and Solaris). This interface does not support changing the buffer size, hence the buffer size is fixed to the internal OS buffer size. The only thing that changing the buffer size will do (for the output buffer) is have VISA not flush the data after every write. This is a limitation in the POSIX serial API; therefore, VISA reports a warning.

  • Clearing the buffer when reading a 'few' samples

    Hello,
    I am looking to develop a system that samples a number of data channels at ~50 kS/s, reading the data into a buffer. Every 10 ms I want to read just a few samples to check if a condition threshold has been passed - e.g. the pressure has gone above 10 kPa.
    If the condition has occurred, I want to read all the contents of the buffer so I then have the state of the system for the last ~5 seconds leading up to the event.
    - I have set a Task up to read a finite number of samples at 50 kS/s into a buffer, which works out to be 5 s long.
    - Every few milliseconds I use the Read command to read just the most recent bit of data, so I set the 'read number of samples' to 100 or so.
    If the processing routine detects the condition, it fires an event which issues a Read command with the 'read number of samples' set to -1, i.e. all.
    The problem is, though: will the Read All command then sit there for 5 seconds gathering a buffer full of data, when what I want it to do is give me the last 5 seconds of data?
    I hope this makes sense, I would appreciate any thoughts.
    Best regards,
    Martin

    Is all of this happening in one loop? It sounds like you need (at least) two separate loops in a producer/consumer structure. With queues, you can peek at data without stopping or pausing the acquisition. One queue could hold all the data while you skim off the top with your current method. When you want to see all the data, just read the queue. Look up Producer/Consumer and/or queues and notifiers.
    Good luck with your project!
    Richard
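
    Richard's advice is LabVIEW-specific, but the same shape can be sketched in Java (the queue capacity and sample values below are illustrative): a producer loop fills a bounded queue, the consumer peeks cheaply at the head, and the whole backlog is drained only when the trigger condition fires.

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerSketch {
        public static void main(String[] args) throws InterruptedException {
            // Bounded queue standing in for the 5-second history buffer.
            BlockingQueue<Double> samples = new ArrayBlockingQueue<>(1024);

            // Producer loop (would run in its own thread in a real system).
            for (int i = 0; i < 100; i++) samples.put(i * 0.1);

            // Consumer: peek at the oldest queued sample without removing it.
            double head = samples.peek();
            System.out.println("head = " + head);

            // Threshold event fired: drain everything accumulated so far.
            List<Double> history = new ArrayList<>();
            samples.drainTo(history);
            System.out.println("drained " + history.size() + " samples");
        }
    }
    ```

    drainTo returns immediately with whatever is already queued, which answers the worry above: reading the backlog does not wait another 5 seconds for new data.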

  • How do you clear the buffer of excess key presses, in Java?

    Right now, in the game I am working on, if the user presses a key multiple times the actions will be performed as many times as it registers, even though I disallow key presses during the time that the action is taking place.
    For example, a move of 1 space can be carried out with a key press. While the character is moving, key presses are not allowed. However, if you just press the key 5 or 25 times, the character will just keep moving, once he's done with the first one.
    What I need to do is clear the buffer of key presses after an action has taken place, so that the excess keys are not registered. I know the commands to do so in C++, but not in Java.

    Well, I think you've got what I was saying backwards, or I'm not understanding what you are asking correctly.
    It is supposed to return if isMoving is true, not false. If isMoving is true, it means that the moving animation and actions are taking place, which is a process that takes about a second for the player to move one space.
    Here's the code where I process the keypresses
    private void processKey(KeyEvent e) {
        if (isMoving) { return; }
        int keyCode = e.getKeyCode();
        if (!isPaused && !gameOver) {
            isMoving = true;
            if (keyCode == KeyEvent.VK_UP) {
                player.move(player.NE);
            } else if (keyCode == KeyEvent.VK_DOWN) {
                player.move(player.SW);
            } else if (keyCode == KeyEvent.VK_LEFT) {
                player.move(player.NW);
            } else if (keyCode == KeyEvent.VK_RIGHT) {
                player.move(player.SE);
            }
            isMoving = false;
        }
    }
    And in case I need to clear up what I want to happen, and the results I'm seeing: the player moves one space in a direction with a single press of an arrow key. That functionality is working fine, as intended. However, I don't want you to press the key twice quickly and move two spaces, and I don't want you to hold down an arrow key and move continually.
    That part of the functionality is what is not working.
    As I understand my code, pressing the key once should set isMoving to true and, therefore further presses shouldn't register until isMoving is false again, which shouldn't happen until the player moves have finished.
    But, the way that it is working now is that if I press the key 2, 3 or 20 times, as soon as one move is finished another begins. Are the key presses just sitting in the buffer somewhere waiting for isMoving to become false again? And if so, is there a function I can call to clear that buffer after a move is finished?
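
    The guard pattern being described can be shown in isolation (a hypothetical stub with no Swing involved, where moves stands in for player.move(...)): while a move is in flight, repeated calls are swallowed, and input is accepted again only after an explicit reset when the animation finishes.

    ```java
    public class MoveGuard {
        boolean isMoving = false;
        int moves = 0;

        // Called on each key press; ignores input while a move is running.
        void processKey() {
            if (isMoving) return;  // swallow repeats while animating
            isMoving = true;
            moves++;               // stand-in for player.move(...)
        }

        // Called only when the move animation has fully completed.
        void moveFinished() { isMoving = false; }

        public static void main(String[] args) {
            MoveGuard g = new MoveGuard();
            g.processKey();              // starts a move
            g.processKey();              // ignored: still moving
            g.processKey();              // ignored
            System.out.println(g.moves); // 1
            g.moveFinished();
            g.processKey();              // now accepted
            System.out.println(g.moves); // 2
        }
    }
    ```

    If the posted code resets isMoving at the end of processKey itself (as the brace placement suggests), the guard is already false by the time the next queued key event is dispatched, which would produce exactly the repeat-move behaviour described.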

  • Unused tables in the buffer cache

    I have a program that queries the V$BH view every hour and stores the name of every table in the buffer cache. There is a set of tables that are never used which appear in the buffer cache every hour. I did a case-insensitive search of the V$SQL and V$SQLTEXT views for these table names, but found nothing. How can tables be put in the buffer cache if there is no SQL statement referring to them? I'd like to find the session and program which is using these tables that no one is supposed to be using.
    Kevin Tyson

    This can be due to recursive SQL - SQL not fired directly by the app, but fired by Oracle to satisfy some requirement. It can be related to system tables or to other users' tables.
    Example of system tables:
    Oracle uses system tables to reflect the current state of the database: when you insert records it updates the extent information, and when you create an object it updates the data dictionary so that the object is reflected there, and so on.
    Example of user tables:
    You fire an insert statement to insert one record into table emp, but to insert that record Oracle has to check the state of the foreign key data by querying dept and other tables. So even though you didn't reference the dept table, Oracle uses it internally to check the integrity constraint.
    Daljit Singh

  • How to add request in the buffer

    Hi All,
    How do I add requests to the buffer in bulk?
    I am using the following command:
    tp addtobuffer <requestno> <sid>
    but after running it, it gives me the following error:
    C:\usr\sap\trans\bin>tp         addtobuffer     BWDK900161      BT1
    ^CThis is tp version 340.07 (release 640, unicode enabled)
    E-TPSETTINGS could not be opened.
    EXIT
    ERROR: System : Parameter SAPEVTPATH not set. Batch jobs cannot be started.
    Error in TPSETTINGS: transdir not set.
    tp returncode summary:
    TOOLS: Highest return code of single steps was: 0
    ERRORS: Highest tp internal error was: 0208
    tp finished with return code: 208
    meaning:
      error in transportprofil (param missing, unknown,...)
    Pls help.
    Regards,
    Viren.

    Hi Rolf,
    Thanks for your reply.
    But it is still not working; it shows the following message:
    C:\usr\sap\trans\bin>tp addtobuffer BWDK900051 BT1 pf=c\usr\sap\trans\bin\TP_DOM
    AIN_BT1.PFL
    This is tp version 340.07 (release 640, unicode enabled)
    E-c\usr\sap\trans\bin\TP_DOMAIN_BT1.PFL could not be opened.
    EXIT
    ERROR: System : Parameter SAPEVTPATH not set. Batch jobs cannot be started.
    Error in c\usr\sap\trans\bin\TP_DOMAIN_BT1.PFL: transdir not set.
    tp returncode summary:
    TOOLS: Highest return code of single steps was: 0
    ERRORS: Highest tp internal error was: 0208
    tp finished with return code: 208
    meaning:
      error in transportprofil (param missing, unknown,...)
    Regards,
    Viren.

  • PLEASE can an AE from NI take a look at my problem: Sound Input read behaves strangely when the buffer size is larger than 2x the number of samples to read

    On my computer I have discovered some strange behavior when reading data from the sound card. When the buffer size is 2x the number of samples to read, everything is as expected. But since I read the sound card 10 times per second, I feel a 0.2 second buffer is too small. I am using XP, and XP is not an RTOS, so with a buffer set to 0.2 seconds I may lose data. Therefore I set the buffer size (number samples/ch on Sound Input Configure.vi) to be in the range of 2 seconds. The result is that when reading from Sound Input.vi, a read often takes more than 0.1 second - on my computer it is often 500 ms. Then the next 5 reads follow with almost zero interval. I do not lose data, but on my front panel the graphs look like a very early silent movie. This error was introduced in LabVIEW 8.x. To be honest, I think the LabVIEW 7.x sound system was much better in many ways.
    But before I point any finger at NI, other people have to verify the behavior I experience. I have made an example showing this error. It is a modified version of the "Continuous Sound Input.vi" example. When the "buffer in seconds" control is set to 0.2, everything works OK. Changing this to a larger number will produce the hiccup mentioned above; the larger the number in this control, the larger the hiccup. Is there any way to fix this? My solution up to now has been to use free third-party software (http://www.zeitnitz.de/Christian/index.php?sel=waveio), but I guess it will soon be outdated - it may not work with newer Windows versions.
    Any help at all will be appreciated 
    And yes, I have the most updated version of DirectX. I also see this in LabVIEW 2009, of which I have a trial version. The VI I have made is in 8.6.
    Message Edited by Coq Rouge on 09-07-2009 10:54 AM
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)
    Attachments:
    Continuous Sound Input with timing.vi ‏23 KB

    macaba wrote:
    If you take a moving average of the 0.2s buffer vs. the 3s buffer at an update rate of 10, then they are the same (just under 100 ms), so the average refresh rate is the same. I agree that it is odd behaviour that the time between sound reads goes to zero quite a lot, then takes a long time once in a while (presumably to fill the buffer).
    I guess it goes to zero because it is reading data already in the buffer, so it does not have to wait for data from the sound card. The mysterious thing is the periodic delay. You are also correct in saying that the average timing is correct. And in my application I have no data loss.
    If you search for sound in this forum, you will find that many people have reported trouble with the sound system.
