File or Memory Stream Access through Flex

Dear all
I am creating a desktop sharing application written in Flex. For desktop sharing I have a component that generates a series of images of a selected screen area, where only the changed screen areas are transmitted. I would like to pass these images to Flex/Flash Player to send them either over P2P or a Flash Media Server to the listening clients. My questions are:
- Can Flex/Flash attach to a memory stream object that is generated in a C++ component? Does a bridge exist here that is performant enough?
- If this bridge is not possible, can Flex/Flash read files from the hard disk? As far as I know a local-trusted sandbox is needed. What does this look like?
Thanks in advance for your help,
Marc

hi,
If you want a true desktop application you can use Flex and compile it to a native desktop application using AIR. If you want to read local files from the hard disk and upload them, you can do this with a normal Flex/Flash Player 10 browser application. There is no restriction on which data files you can open.
The following Flex app opens images on the local drive and displays them, just to give you an idea of local file access from a browser app.
http://gumbo.flashhub.net/pagedrop/ - source included

Similar Messages

  • Recording video/audio files using Flash Media Server through RTMP, and allowing users to access the recorded files through HTTP.

    As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and to allow users to access the recorded files through HTTP?
    What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As far as I know computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files using HTTP instead of RTMP. Is that true?
    How can I redirect the HTTP request to the files I recorded into my applications/appName folder? Or do I need to somehow move the recorded files to the /webroot folder?
    Thanks!

    I probably have asked a stupid question.
    My recorded streams are still saved in the applications/appName/streams folder.
    I redirected www.mydomain.com/streams to point to the applications/appName/streams folder,
    and the RTMP-recorded streams can now be accessed through HTTP.

  • Can I use the new Time Capsule to back up my mid-2010 MacBook Pro? Also, as I want to free up my hard disk, can I save my photos and files on the Time Capsule and later access them through Wi-Fi?

    Can I use the new Time Capsule to back up my mid-2010 MacBook Pro? Also, as I want to free up my hard disk, can I save my photos and files on the Time Capsule and later access them through Wi-Fi?

    Can I use the new Time Capsule to back up my mid-2010 MacBook Pro?
    Yes, if you are asking about using Time Machine to back up the Mac.
    Also, as I want to free up my hard disk, can I save my photos and files on the Time Capsule and later access them through Wi-Fi?
    You are not thinking of deleting the photos and files on your Mac, are you? If you do this, you will have no backups for those files.
    Another concern is that Time Machine backs up the changes on your Mac. At some point, Time Machine will automatically delete the photos and files from the Time Capsule... you just don't know when this might occur.
    In other words, only delete files from your Mac that you can afford to lose.

  • Java files inside a jar file cannot be read or accessed through Eclipse

    The Java files inside the jar file cannot be accessed through Eclipse. It shows errors for the modules in the jar file.
    But when compiled with the jar files added to the classpath, the compilation is successful.
    How can I browse through the files in the jar, such as going into a function definition?
    TIA ,
    Imran

    Open MPlayer OSX and show the playlist (command-shift-p) then drag the file into the playlist. Click the file in the playlist and click the "i"nfo button. What does it list for file, video and audio format?
    Not all of the codecs used in avi and wmv files are available for OS X so I'm guessing your file happens to be using one that isn't...

  • Extended Analytics - Is it possible to extract data from HFM to MS Access through a UDL (ODBC) file?

    Hello,
    I am trying to extract data from HFM, using Extended Analytics, to MS Access through a UDL (ODBC) file, but it displays the following error:
    "error occurred while connecting to the database"
    I've followed these steps:
    1. Create a blank MS Access database (.mdb format)
    2. Create an .UDL file which connects to MS Access database:
          - Provider: Microsoft Jet 4.0 OLE DB Provider
          - Connection: Linked to the .mdb database created in step 1.
          - Give "ReadWrite" permissions.
    3. Test .UDL file connection successfully.
    4. Create a DSN with the Extended Analytics DSN Configuration tool.
    5. In Extended Analytics (HFM), once the new DSN is selected and the extraction is run, it displays the error mentioned above.
    Do you know how I can solve this issue?
    Thank you in advance,
    Best regards,

    Hello Anjum Ara,
    Thank you for your response.
    I don't understand how to add my database SID to tnsnames.ora. (I have already found the file, but I don't know how to add a new database SID.)
    I have created an MS Access database at "C:\TEST.mdb" and I want to connect Extended Analytics to it.
    How do I add this database to the tnsnames.ora file?
    Thank you in advance,
    Kind regards

  • Random access file with object stream

    Hi All,
    I have saved some objects in a file using an object stream. Sequential reading works for me, but I want to reduce the overhead of reading the whole file to get the details of one specific object. Is there any way to implement RandomAccessFile with an object stream? Or is there any other alternative for reaching a specific object in the file without a sequential read?

    is there any other alternative of reaching specific object in the file without sequential read?
    Write the objects to a database.
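    If a database is not an option, one common alternative is to serialize each object as an independent record and keep an index of byte offsets, so RandomAccessFile can seek straight to the record you want. A minimal Java sketch (the class and method names here are made up for illustration, not from the thread):

    ```java
    import java.io.*;
    import java.util.*;

    public class IndexedObjectFile {
        // Per-record index: {byte offset, record length}
        private final List<long[]> index = new ArrayList<>();

        public void writeAll(File file, List<? extends Serializable> objects) throws IOException {
            try (FileOutputStream fos = new FileOutputStream(file)) {
                long offset = 0;
                for (Serializable obj : objects) {
                    ByteArrayOutputStream buf = new ByteArrayOutputStream();
                    // Each record gets its own ObjectOutputStream (and thus its own
                    // stream header), so it can be deserialized independently later.
                    try (ObjectOutputStream oos = new ObjectOutputStream(buf)) {
                        oos.writeObject(obj);
                    }
                    byte[] bytes = buf.toByteArray();
                    fos.write(bytes);
                    index.add(new long[] {offset, bytes.length});
                    offset += bytes.length;
                }
            }
        }

        public Object read(File file, int recordNo) throws IOException, ClassNotFoundException {
            long[] entry = index.get(recordNo);
            byte[] bytes = new byte[(int) entry[1]];
            try (RandomAccessFile raf = new RandomAccessFile(file, "r")) {
                raf.seek(entry[0]);   // jump straight to the record, no sequential scan
                raf.readFully(bytes);
            }
            try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                return ois.readObject();
            }
        }
    }
    ```

    The trade-off is that the index itself must be persisted somewhere (e.g. at the start of the file or in a sidecar file), and records cannot share serialized object graphs since each one is a self-contained stream.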

  • How to render PDF in InfoPath whose source is a memory stream

    I'm struggling to interface InfoPath calling a Web service that generates a PDF and rendering the PDF for the user.
    The document workflow is:
    Using InfoPath (using both the fat client and browser client), display a form for the user to complete.
    Collect the entered information and call the Opus Web Service (Elixir) to render a PDF.
    The PDF is sent back to InfoPath through the web service call.
    Open the PDF for the user and provide them the ability to review and/or print the document.
    It's this last step that's causing the problems -- it's no problem when this is all part of a web page. This workflow is used for thousands of PDF reviews via web pages daily.  However, because I have to use InfoPath, I'm struggling to find an easy solution.  I can't write the returned data to disk and open the document because the InfoPath browser is really tied into SharePoint, and I don't want to store temporary PDF files on the SharePoint server.
    My issue is : How to render a PDF document that is sourced as a memory stream.
    I'm down to a couple of possible solutions and have some questions:
    1) I looked at writing a plug-in to support another ASFileSys interface which would be used to read from memory instead of from disk. But I don't see a way to invoke my plug-in through an API.  Is this possible?
    2) I looked at the OLE interfaces, but didn't see a nice way to pass a document in through memory.  Is there an interface that accomplishes this task?
    3) I did see that I could create a compound document in memory and then call the OLE interfaces.  Has anyone done this and is it maintainable ? 
    4) The OLE format also lends itself to placing the PDF inside of an RTF to display. Seems like a lot of work to put a wrapper around the PDF, but I do think that also alleviates the need to register any components in SharePoint.  Again, has anyone seen this done?
    In the end, none of these seem very straight forward. So I have to ask, am I missing something obvious here? 
    Thanks for your help,
    Chris Andrews

    InfoPath has a fat client interface and a browser version where the server is a SharePoint server.  So the web service executes on the local desktop or on the SharePoint server, depending upon which interface is used.  The best way to work on InfoPath solutions is to focus on the fat client solution with a couple of caveats. Assume that you can't create temp files, assume that you can't invoke a new application, and assume that you aren't in an HTTP request/response environment (since SharePoint is a wrapper around the back-end browser interface).  You can invoke code that is in DLLs, OCX, TLB, etc. (typical ActiveX or library code).
    Chris

  • How to open a file created at the server through form/report at client end

    How to open a file created at the server through form/report at client end
    Dear Sir/Madame,
    I am creating an exception report at the server end using the UTL_FILE utility. I want to display this report at the client end. A user doesn't have any access to the server. Will you please write me the solution and oblige me.
    Thanks
    Rajesh Jain

    One way of doing this is to write a PL/SQL procedure that uses UTL_FILE to read the file and DBMS_OUTPUT to display the contents to the users.
    Cheers, APC

  • How to load a BDB file into memory?

    The entire BDB database needs to reside in memory for performance reasons, it needs to be in memory all the time, not paged in on demand. The physical memory and virtual process address space are large enough to hold this file. How can I load it into memory just before accessing the first entry? I've read the C++ API reference, and it seems that I can do the following:
    1, Create a DB environment;
    2, Call DB_ENV->set_cachesize() to set a memory pool large enough to hold the BDB file;
    3, Call DB_MPOOLFILE->open() to open the BDB file in memory pool of that DB environment;
    4, Create a DB handle in that DB environment and open the BDB file (again) via this DB handle.
    My questions are:
    1, Is there a more elegant way instead of using that DB environment? If the DB environment is a must, then:
    2, Does step 3 above load the BDB file into memory pool or just reserve enough space for that file?
    Thanks in advance,
    Feng

    Hello,
    Does the documentation on "Memory-only or Flash configurations" at:
    http://download.oracle.com/docs/cd/E17076_02/html/programmer_reference/program_ram.html
    answer the question?
    From there we have:
    By default, databases are periodically flushed from the Berkeley DB memory cache to backing physical files in the filesystem. To keep databases from being written to backing physical files, pass the DB_MPOOL_NOFILE flag to the DB_MPOOLFILE->set_flags() method. This flag implies the application's databases must fit entirely in the Berkeley DB cache, of course. To avoid a database file growing to consume the entire cache, applications can limit the size of individual databases in the cache by calling the DB_MPOOLFILE->set_maxsize() method.
    Thanks,
    Sandra

  • Guest LDOM disk access through multiple IO domains

    Hi All,
    I am working on a configuration wherein the boot disk for the guest LDOM is provided through an image file hosted on a VxVM diskgroup (vmdg1). The configuration has another copy of the same image file being provided through another VxVM diskgroup (vmdg2) through another virtual disk service.
    Let me describe the configuration in a little more detail:
    A T-5240 server having 2 IO domains configured
    Primary (Control domain + IO domain + Service domain) configuration
    A VxVM diskgroup (vmdg1) having a boot image file
    Secondary (IO domain + Service domain) configuration
    A VxVM diskgroup (vmdg2) having a copy of the boot image file
    These devices are exported through their respective virtual disk services with the same mpgroup name to a guest LDOM. The vdisk is then assigned to the guest LDOM, which uses the volume through the primary service.
    When the guest LDOM is started, it starts with the disk exported through the primary domain. All writes happen fine. When the VxVM diskgroup is deported from the primary, the guest LDOM still remains online and starts using the disk image path through the secondary domain.
    I then bring the VxVM diskgroup and the mounts back online on the primary domain and deport the diskgroup from the secondary domain to see if it fails back to the image through the primary domain. The guest LDOM is now in a hung state and does not allow access through the local console or through network logins.
    Has anyone seen such a problem? Also, is it recommended to use a disk-based image as a backend device in mpgroups through "ldm add-vdsdev"?
    TIA,
    Sudhir

    As far as I know, the only way to "re-balance" the I/O across the domains is to unbind/bind the guests. Not a great answer, but this could be done as part of the guests' patching cycle.
    I think there is an RFE to provide MPxIO-like features to guests.

  • Windows 7, file properties - Is "date accessed" ALWAYS 100% accurate?

    Hello,
    Here's the situation: I went on vacation for a couple of weeks, but before I left, I took the harddrive out of my computer and hid it in a different location. Upon coming back on Monday (January 10, 2011) and putting the harddrive back in my computer, I
    right-clicked on different files to see their properties. Interestingly enough, several files had been accessed during the time I was gone! I right-clicked different files in various locations on the harddrive, and all of these suspect files had been accessed
    within a certain time range (Sunday, January 09, 2011, approximately between 6:52:16 PM and 9:06:05 PM). Some of them had been accessed at the exact same time--down to the very second. This makes me think that someone must have done
    a search on my harddrive for certain types of files and then copied all those files to some other medium. The Windows 7 installation on this harddrive is password protected, but NOT encrypted, so they could have easily put the harddrive into an enclosure/toaster
    to access it from a different computer.
    Of course I did not right-click every single file on my computer, but did so in different folders. For instance, one of the folders I went through has different types of files: .mp3, ,prproj, .3gp, .mpg, .wmv, .xmp, .txt with file-sizes ranging from 2 KB
    to 29.7 MB (there is also a sub-folder in this folder which contains only .jpg files); however, of all these different types of files in this folder and its subfolder, all of them had been accessed (including the .jpg files from the sub-folder) EXCEPT the
    .mp3 files (if it makes any difference, the .mp3 files in this folder range in size from 187 KB to 4881 KB). Additionally, this sub-folder which contained only .jpg files (48 .jpg files to be exact) was not accessed during this time--only the .jpg files within
    it were accessed-- (between 6:57:03 PM - 6:57:08 PM).
    I thought that perhaps this was some kind of Windows glitch that was displaying the wrong access date, but then I looked at the "date created" and "date modified" for all of these files in question, and their created/modified dates and
    times were spot on correct.
    My first thought was that someone put the harddrive into an enclosure/toaster and viewed the files; but then I realized that this was impossible because several of the files had been accessed at the same exact time down to the second. So this made me think
    that the only other way the "date accessed" could have changed would have been if someone copied the files.
    Is there any chance at all whatsoever that this is some kind of Windows glitch or something, or is it a fact that someone was indeed accessing my files (and if someone was accessing my files, am I right about the files in question having been copied)? Is
    there any other possibility for what could have happened?
    Do I need to use any kinds of forensics tools to further investigate this matter (and if so, which tools), or is there any other way in which I can be certain of what took place in that timeframe the day before I got back? Or is what I see with Windows 7
    good enough (i.e. accurate and truthful)?
    Thanks in advance, and please let me know if any other details are required on my part.
    P.S. The harddrive is NTFS.

    Never mind.  Someone else already answered this for me:
    "I use last accessed-created date time stamps all the time when troubleshooting-investigating software installs, its been very accurate in my uses of it in NTFS file systems.
    Since some of the dates are while you were gone, I assume it was several days to a week, I would say someone did access those files.
    These timestamps along with other data are used by computer forensics teams to reconstruct what a user did on a computer.
    Experienced Hackers use software to alter these time stamps to cover their tracks when breaking into computer systems."
    http://superuser.com/questions/232143/windows-7-file-properties-is-date-accessed-always-100-accurate/232320#232320
    Additionally, I just now found out what happened. Someone else found the harddrive and thought it was theirs (since it was identical to one they had been missing), so they put it in their computer and scanned it with an anti-virus software. They realized it
    wasn't theirs and then put it back in the place I had hidden it.
    In my original question, I stated a possible theory that someone may have copied the files, but I later realized that copying in itself doesn't affect the "date accessed" of the original/source file, but rather only the "date accessed" of
    the copy.
    Thanks to all those that may have read my question.

  • Biztalk custom send pipeline Using memory stream

    Hi friends,
    I am developing a custom send pipeline that calls a custom helper class. The helper class uses a MemoryStream.
    The helper class method looks like:
    public MemoryStream UpdateProcess(MemoryStream ms)
    In the pipeline I need to call this method:
    public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
    {
        try
        {
            // Get the incoming message
            System.IO.Stream originalStream = inmsg.BodyPart.GetOriginalDataStream();

            // Work with XDocument
            XDocument xDoc;
            using (XmlReader reader = XmlReader.Create(originalStream))
            {
                reader.MoveToContent();
                xDoc = XDocument.Load(reader);
            }

            // Return the document as a stream
            byte[] output = System.Text.Encoding.ASCII.GetBytes(xDoc.ToString());
            MemoryStream memoryStream = new MemoryStream();
            memoryStream.Write(output, 0, output.Length);
            memoryStream.Position = 0;   // rewind before handing the stream to the helper

            ProcessHelper mf = new ProcessHelper();
            mf.UpdateProcess(memoryStream);

            memoryStream.Position = 0;   // rewind again before returning
            inmsg.BodyPart.Data = memoryStream;
            return inmsg;
        }
        catch (Exception)
        {
            throw;   // rethrow without resetting the stack trace
        }
        finally
        {
            if (parms != null)   // parms is a field on the component
                parms.Clear();
        }
    }
    Can anyone let me know if the logic in my pipeline component is correct?
    Thanks
    hk

    Realistically, we can't say for sure if it's right, since we don't know the implementation of the helper.
    However, since it looks like the helper expects a MemoryStream with XML content, you can probably just copy the data from the original stream to the local MemoryStream.  Passing it through an XDocument might be unnecessary.
    Also, unless you are absolutely sure the source document is ASCII, you shouldn't use ASCII encoding, since you can lose meaningful character data.
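    To illustrate the encoding point in general terms (a standalone Java sketch of the same issue, not BizTalk code; the class and method names are made up): an ASCII round trip silently replaces every unmappable character, while UTF-8 preserves the document.

    ```java
    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;

    public class EncodingLoss {
        // Encode the string with the given charset and decode it back.
        static String roundTrip(String s, Charset cs) {
            return new String(s.getBytes(cs), cs);
        }

        public static void main(String[] args) {
            String xml = "<name>Müller</name>";
            // UTF-8 round-trips every character unchanged.
            System.out.println(roundTrip(xml, StandardCharsets.UTF_8));   // <name>Müller</name>
            // US-ASCII replaces unmappable characters with '?', with no exception thrown.
            System.out.println(roundTrip(xml, StandardCharsets.US_ASCII)); // <name>M?ller</name>
        }
    }
    ```

    The replacement happens because String.getBytes(Charset) substitutes the charset's default replacement byte for anything it cannot encode, which is exactly the silent data loss the reply warns about.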

  • ADF app hangs when accessing through browser

    Hello ADF/Weblogic friends,
    When I try to access the ADF app, which was deployed successfully in WebLogic, it says "loading..." forever. I do not see any visible errors in the log. I verified that the deployed DB connection is working. I also created the datasource for the same db/JNDI name at the server level.
    The same app works through JDeveloper (11.1.1.4.0).
    Configuration details are "WebLogic Server Version: 10.3.4.0", "Oracle Enterprise Manager - Fusion Middleware Control 11.1.1.3.0".
    Would be helpful if somebody sheds light on this issue.
    thanks

    Thank you for your reply... sorry for not replying earlier... I had given up hope : )
    You are right about the psadmin binary not being in psconsole.war... maybe since it is Portal Server 6, it is some other executable.
    When I asked the person who had given me the setup about psconsole not being available, he said that psconsole was working on his machine. He is not a technical guy and had not installed SJS 6 himself, so he could not help me with this problem. It could be that the psconsole.war file was missing from the installation.
    I searched for the war files in both the installed directory as well as in the setup files and could not find it. : (
    Anyway, I uninstalled SJS 6 2004Q2 using the setup.exe (in Windows), and while uninstalling it showed a window saying "psconsole was not found while accessing through the browser. Retry | Ignore | Exit". Also, the uninstallation successfully removed all the services required for starting components like the directory server, web server, calendar server (cacao also), etc.
    I've downloaded Portal Server 7.2. Let's see how that turns out... Thanks anyway...

  • Not able to access the CAS service through Endeca Workbench: getting the error "CAS Console must be accessed through Workbench"

    I am getting an error while accessing the Datasource tab through Endeca Workbench: "CAS Console must be accessed through Workbench". I verified both the ws-extensions.xml and casconsole.properties files, which exist under %ENDECA_TOOLS_CONF%/conf; both files share the same sharedSecret. Does anybody know how to resolve this issue?
    I did try to reinstall the CAS component multiple times, and the same error was thrown every time.
    The Endeca CAS service is also not able to start through the Microsoft console (services.msc), throwing the error "Windows could not start the Endeca CAS Service service on Local Computer".

    This can occur if CAS is reinstalled. Check your ws-extensions.xml and casconsole.properties files, and ensure that they agree on the same shared-secret.
    Best
    Brett

  • How to refer the Program Data directory in Win 7 through Flex?

    Hi All..!
    I have a requirement in which I have to store some files in the Program Data directory in Win 7... What I can't find is how to get the path to this directory through Flex. I was hoping to find something similar to the File.userDirectory.resolvePath method... So far, nothing. Also, I need to know what the equivalent folder of Program Data is on Mac... Please help.
    Thanks..!!

    Create a servlet which gets an InputStream of the file and writes it to the OutputStream of the response, and call this servlet in the `src` attribute. That's basically all.
    Here's an example which covers most of the minimum needs: [http://balusc.blogspot.com/2007/04/imageservlet.html].
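    The core of such a servlet is just a buffered stream copy from the file to the response. A minimal standalone Java sketch of that copy step (the class and method names are illustrative, not the Servlet API):

    ```java
    import java.io.*;

    public class StreamCopy {
        // Copy all bytes from in to out with a fixed-size buffer; this is the
        // same loop a file-serving servlet runs between the file's InputStream
        // and the response's OutputStream. Returns the number of bytes copied.
        static long copy(InputStream in, OutputStream out) throws IOException {
            byte[] buffer = new byte[8192];
            long total = 0;
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
                total += n;
            }
            out.flush();
            return total;
        }

        public static void main(String[] args) throws IOException {
            byte[] data = "example file contents".getBytes("UTF-8");
            ByteArrayOutputStream sink = new ByteArrayOutputStream();
            System.out.println(copy(new ByteArrayInputStream(data), sink));
        }
    }
    ```

    In the real servlet the sink would be the response's OutputStream, and you would also set the Content-Type (and ideally Content-Length) headers before writing, as the linked example does.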
