Recommended Structure for Large Files

I am re-familiarizing myself with Oracle development and management, so please forgive my ignorance on some of these topics. I will be working with a client who is planning a large-scale database for "Flow Cytometry" data, which will be linked to research publications. The actual data files (FCS) and various text, tab-delimited and XML files will all be provided by researchers in a wrapper or zip container, which will be parsed by some as-yet-to-be-developed tools.

For the most part, each submission will consist of a large FCS file containing the actual flow cytometry data, along with accompanying text/XML files containing the metadata (experiment details, equipment, reagents, etc.). The metadata is what matters most, since it will be used to search for experiments. The FCS data files themselves (100-300 MB each) will mostly just need to be linked to the metadata (stored as BLOBs?), and their content will only be used later for actual analysis.
1: Since the actual FCS files are large, and may not initially be parsed and imported into the DB for later analysis, how can/should Oracle be configured or partitioned so that a larger direct-attached storage drive/partition can be used for the large files, so they do not take up space where the running Oracle instance is installed? We are expecting around 1 TB of data files initially.
2: Are there any online resources that might be of value for such an implementation?

Large files can be stored using the BFILE datatype. The data need not be transferred into Oracle tablespaces; the files remain in the OS file system.
It is also possible to index BFILEs using Oracle Text.
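As a minimal sketch of that approach (all object names, paths and connection details below are hypothetical, not taken from the original question or answer): create a DIRECTORY object that points at the dedicated direct-attached storage volume, keep only a BFILE locator plus the searchable metadata in the table, and put an Oracle Text index on the metadata column. In a Java ingestion tool (assuming the Oracle JDBC driver is on the classpath), that could look roughly like this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

// Hypothetical sketch: register an FCS file as a BFILE locator and store its
// searchable metadata. The directory object maps a logical name to an OS path
// on the dedicated storage, so the large files never consume tablespace.
// Connection details, names and paths are made up for illustration.
public class FcsIngest {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "fcs_app", "secret");
             Statement st = con.createStatement()) {

            // One-time setup (normally done by the DBA with appropriate privileges).
            st.execute("CREATE OR REPLACE DIRECTORY fcs_dir AS '/fcsdata/incoming'");
            st.execute("CREATE TABLE experiment ("
                    + " exp_id        NUMBER PRIMARY KEY,"
                    + " metadata_text CLOB,"       // parsed text/XML metadata
                    + " fcs_file      BFILE)");    // pointer to the FCS file on disk

            // Oracle Text index so experiments can be found with CONTAINS() queries.
            st.execute("CREATE INDEX experiment_meta_idx ON experiment (metadata_text)"
                    + " INDEXTYPE IS CTXSYS.CONTEXT");

            // Per-file ingestion: only a locator (directory + file name) is stored.
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO experiment (exp_id, metadata_text, fcs_file)"
                            + " VALUES (?, ?, BFILENAME('FCS_DIR', ?))")) {
                ps.setLong(1, 1L);
                ps.setString(2, "<experiment>... parsed metadata ...</experiment>");
                ps.setString(3, "run_0001.fcs");
                ps.executeUpdate();
            }
        }
    }
}

Searches then run against the indexed metadata (for example WHERE CONTAINS(metadata_text, 'cytometry') > 0), and the FCS content is only read through the BFILE locator when an analysis job actually needs it, so the roughly 1 TB of raw files stays on the direct-attached volume rather than inside Oracle's tablespaces.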
http://www.oracle-base.com/articles/9i/FullTextIndexingUsingOracleText9i.php
http://www.stanford.edu/dept/itss/docs/oracle/10g/text.101/b10730/cdatadic.htm
http://www.idevelopment.info/data/Oracle/DBA_tips/Oracle_Text/TEXT_3.shtml
Mohan
http://www.myoracleguide.com/

Similar Messages

  • Thumbnails for large files

    OK, so now I've learned that for my large panorama files I won't get a thumbnail in the Organizer, just a blue triangle warning with an exclamation point. I do a lot of golf course work where I stitch photos and print 38 x 11 inches @ 300 dpi.
    So what's the workaround? Is this a problem in the big Photoshop program? Sure, I can copy the file, reduce the resolution, save it with a different name, etc.
    What the heck! You can't tell me that the program writers at Adobe can't make this work for large files! Why the limit? By the way, what is the limit?
    Thanks

    By the way...what is the limit?
    http://kb2.adobe.com/cps/402/kb402760.html
    Juergen

  • Need PDF Information for large files

    Hi experts,
    I need some information from you regarding PDFs.
    Is there a newer Adobe version that supports large files - over 15 MB, over 30 pages?
    Is batch processing possible with PDF?
    If not, please suggest possible Adobe replacements.
    Thanks
    Kishore

    Thanks so far.
    acroread renders the pages very fast. That's what I want - but it's proprietary :/
    For the moment it's OK, but I'd like to have a free alternative to acroread that shows the pages as quickly as acroread or preloads more than one page.
    lucke wrote: Perhaps, just perhaps it'd work better if you copied it to tmpfs and read from there?
    I've tried it: no improvement.
    Edit: tried Sumatra: an improvement compared to Okular etc. Thanks.
    Last edited by oneway (2009-05-12 21:50:10)

  • Faster alternative to cfcontent for large file downloads?

    I am using cfcontent to securely download files so the user cannot see the path where the file is stored. With small files this is fine, but with large 100 MB+ files it is much, much slower than a straight HTML anchor tag. Does anyone have a fast alternative to using cfheader/cfcontent for large files?
    Thanks

    You should be able to use Java to handle this, either through a custom tag or by calling the Java classes directly from ColdFusion. I don't know much about Java, but I found this example on the Web and got it to work for uploading files. Here's an example of the code. Hope it gives you some direction:
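    The referenced snippet is not reproduced above, so purely as an illustration of the general idea - stream the file in fixed-size chunks from Java so the browser never sees the real path and the server never buffers the whole file in memory - here is a hypothetical servlet-style sketch (the directory, parameter and class names are invented, and a servlet API jar is assumed to be on the classpath):

    import java.io.BufferedInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical sketch: serve a protected file by streaming it in chunks, so the
    // client never learns the storage path and the server never loads the whole
    // file into memory. Directory and parameter names are made up for illustration.
    public class SecureDownloadServlet extends HttpServlet {

        private static final String STORAGE_DIR = "/protected/files"; // outside the web root

        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws IOException {
            String param = request.getParameter("file");
            if (param == null) {
                response.sendError(HttpServletResponse.SC_BAD_REQUEST);
                return;
            }
            String name = new File(param).getName(); // strip any path components
            File file = new File(STORAGE_DIR, name);
            if (!file.isFile()) {
                response.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }

            response.setContentType("application/octet-stream");
            response.setHeader("Content-Disposition", "attachment; filename=\"" + name + "\"");
            response.setHeader("Content-Length", String.valueOf(file.length()));

            // Stream in 64 KB chunks instead of reading the whole file first.
            byte[] buffer = new byte[64 * 1024];
            try (InputStream in = new BufferedInputStream(new FileInputStream(file));
                 OutputStream out = response.getOutputStream()) {
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        }
    }

    A chunked copy like this keeps memory use flat no matter how big the file is, which is usually where approaches that load the whole file before sending it start to crawl on 100 MB+ downloads.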

  • SharePoint Foundation 2013 Optimization For Large File Transfer?

    We are considering upgrading from  WSS 3.0 to SharePoint Foundation 2013.
    One of the improvements we want to see after the upgrade is a better user experience when downloading large files.  It can be done now, but it is not reliable.
    Our document library consists of mostly average sized Office documents, but it also includes some audio and video files and software installer package zip files ranging from 100MB to 2GB in size.
    I know we can change the settings to "allow" larger than default file downloads but how do we optimize the server setup to make these large file transfers work as seamlessly as possible? More RAM on the SharePoint Foundation server? Other Windows,
    SharePoint or IIS optimizations?  The files will often be downloaded from the Internet, so we will not have control over the download speed.

    SharePoint is capable of sending large files; it is a stateless HTTP system like any other website in that regard. Given your server is sized appropriately for the amount of concurrent traffic you expect, I don't see any special optimizations required.
    Trevor Seward
    Follow or contact me at...
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

    I see information like this posted warning against doing it as if large files are going to cause your SharePoint server and SQL to crash.
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    "Though SharePoint is meant to handle files that are up to 2 gigs in size, it is not practically feasible and not recommended as well."
    "Not practically feasible" sounds like a pretty dire warning to stay away from large files.
    I had seen some other links warning that large file in the SharePoint database causes problems with fragmentation and large amounts of wasted space that doesn't go away when files are removed or that the server may run out of memory because downloaded
    files are held in RAM.

  • File Splitting for Large File processing in XI using EOIO QoS.

    Hi
    I am currently working on a scenario to split a large file (700 MB) using the sender file adapter "Recordset Structure" property (e.g. Row, 5000). As the files are split and mapped, they are appended to a destination file. In an example scenario a 700 MB file comes in (say with 20000 records) and the destination file should end up with 20000 records.
    To ensure no records are missed during processing through XI, EOIO QoS is used. A trigger record is appended to the incoming file (the trigger record structure is the same as the main payload recordset) using a UNIX shell script before it is read by the sender file adapter.
    XPATH conditions are evaluated in the receiver determination to either append the records to the main destination file or create a trigger file with only the trigger record in it.
    The problem we are facing is that the "Recordset Structure" (e.g. Row, 5000) splits the file into chunks of 5000, and when the remaining records of the main payload are fewer than 5000 (say 1300), those remaining 1300 lines get grouped with the trigger record and written to the trigger file instead of the actual destination file.
    For the sake of this forum I have listed a sample scenario XML file representing the inbound file, with the last record with Duns = "9999" as the trigger record that will be used to mark the end of the file after splitting and appending.
    <?xml version="1.0" encoding="utf-8"?>
    <ns:File xmlns:ns="somenamespace">
    <Data>
         <Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
    </Data>
    </ns:File>
    In the sender file adapter I have, for test purposes, changed the "Recordset structure" to "Row,5" for the sample XML inbound file above.
    I have two XPATH expressions in the receiver determination to take the last recordset with Duns = "9999" and send it to the receiver (communication channel) that creates the trigger file.
    In my test case the first 5 records get appended to the correct destination file. But the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record with Duns = "9999").
    Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
               <Extract_Code>"A"</Extract_Code>
         </R3Row>
              <R3Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
              <R3Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
    </R3File>
    Trigger File:
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
              <R3Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
              <R3Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
    </R3File>
    I've tested the XPATH condition in XML Spy and it works fine. My doubts are about the "Recordset structure" property set as "Row,5".
    Any suggestions on this will be very helpful.
    Thanks,
    Mujtaba

    Hi Debnilay,
    We do have a 64-bit architecture and still we have the file processing problem. Currently we are splitting the file into smaller chunks and processing them. But we want to process the file as a whole.
    Thanks
    Steve

  • FTP Sender Adapter with EOIO for Large files

    Hi Experts,
    I am using XI 3.0.
    My scenario is File -> XI -> File. I need to pick the files from an FTP server; there are around 50 files in it, each about 10 MB in size.
    I have to pick the files from the FTP folder in the same order as they were put into the folder, i.e. FIFO.
    So in the sender FTP communication channel I am specifying
    QoS = EOIO
    queue name = ACCOUNT
    Will the files be picked up in FIFO order into the queue XBTO_ACCOUNT?
    So my questions are:
    What is the procedure to specify the parameters so that the files are processed in FIFO order?
    What would be the best practice to achieve this from a performance point of view?
    Thanks
    Sai.

    Hi spantaleoni,
    I want to process the files using the FTP protocol in FIFO order,
    i.e., files placed first in the folder should be picked up first, the remaining ones next, and so on.
    So if I use FTP, then to process the files in sequence, i.e. FIFO, would the processing parameters
    QoS = EOIO
    Queue name = ACCOUNT
    process the files in FIFO order?
    And to process large files (10 MB in size), what would be the best polling interval in seconds?
    Thanks,
    Sai

  • Seeburger As2 Receiver channel exception for large files

    Hello Folks,
    We have a JMS to Seeburger AS2 interface and we are facing the following issue in the AS2 receiver channel for files larger than 20 MB. In Production it works fine even for 40 MB.
    Delivering the message to the application using connection AS2_http://seeburger.com/xi failed, due to:
    com.sap.engine.interfaces.messaging.api.exception.MessagingException: javax.resource.ResourceException: Fatal exception: javax.resource.ResourceException: SEEBURGER AS2: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated. # , SEEBURGER AS2: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated. # .
    Please throw some light on the issue.
    Regards
    Praveen Reddy

    Hi Praveen,
    The problem is likely related to server sizing. Generally a test system does not have the same sizing as the production server, which is why you cannot process large files in the test system.
    Check the sizing of the system.
    Harish

  • Are the brushes in Photoshop CC faster than CS6 - still need to use CS5 for large files

    Hey,
    Are the brushes in Photoshop CC any faster than Photoshop CS6.
    Here's my standard large file, which makes the CS6 brushes crawl:
    iPad 3 size - 2048 x 1536
    About 20-100 layers
    A combination of vector and bitmap layers
    Many of the layers use layer styles
    On a file like this there is a hesitation to every brush stroke in CS6. Even a basic round brush has the same hesitation; it doesn't have to be a brush as elaborate as a mixer brush.
    This hesitation happens on both the Mac and PC, on systems with 16 GB of RAM. Many of my coworkers have the same issue.
    So, for a complicated file, such as a map with many parts, I ask my coworkers to please work in CS5. If they work in CS6 I ask them to not use any CS6 only features, such as group layer styles. The only reason why one of them might want to use CS6 is because they're working on only a small portion of the map, such as a building. The rest of the layers are flattened in their file.
    Just wondering if there has ever been a resolution to this problem...or this is just the way it is.
    Thanks for your help!

    BOILERPLATE TEXT:
    Note that this is boilerplate text.
    If you give complete and detailed information about your setup and the issue at hand,
    such as your platform (Mac or Win),
    exact versions of your OS, of Photoshop (not just "CS6", but something like CS6v.13.0.6) and of Bridge,
    your settings in Photoshop > Preference > Performance
    the type of file you were working on,
    machine specs, such as total installed RAM, scratch file HDs, total available HD space, video card specs, including total VRAM installed,
    what troubleshooting steps you have taken so far,
    what error message(s) you receive,
    if having issues opening raw files also the exact camera make and model that generated them,
    if you're having printing issues, indicate the exact make and model of your printer, paper size, image dimensions in pixels (so many pixels wide by so many pixels high); if going through a RIP, specify that too,
    etc.,
    someone may be able to help you (not necessarily this poster, who is not a Windows user).
    A screen shot of your settings or of the image could be very helpful too.
    Please read this FAQ for advice on how to ask your questions correctly for quicker and better answers:
    http://forums.adobe.com/thread/419981?tstart=0
    Thanks!

  • Export Media from DV Pal to AVI: Best recommended setting for Output file?

    I have a project in CS4, on Windows, in DV format (original videos were M2t).
    What is the optimal setting for export (and possibly for the sequence settings too) for the best quality and then smallest size?
    If my input is M2T files - is the best settings to work in Premiere is DV 1 for Pal?
    I always use Progressive (rather than interlaced) - is that required, or is the default of Lower Field OK as well (I am not sure exactly what the impact of using just one field is, but in previous versions of Premiere, Progressive always improved the video image)?
    Must the PAL frame rate of 25 FPS be adhered to in the sequence settings and export to avoid re-rendering?
    As for the Export side:
    Is using DV PAL best for quality without going uncompressed?
    Is starting with 16 GB of DV as M2T and ending with 20 GB of DV reasonable?
    Is exporting to DV and then compressing it with a 3rd-party tool the best way to go about creating the movie and then compressing it for distribution?
    What is the the recommended tool for compressing - and with which format?
    The end result is meant to be consumed as an AVI on people's desktop. Not on mobiles or YouTube or DVD.
    On BitTorrent I see many files that are in HD but have an amazingly small size of less than 1 GB - what do these guys use to achieve such small sizes yet amazing quality?
    What is the best tool and codec to achieve a compression based on a file size?
    Is DivX better than Xvid? And is "Multipass, nth pass" the way to configure DivX for best results?
    I really feel we lack any guidance on these aspects, and a huge amount of time is wasted every time I get to this stage when it should all be pretty standard for me.
    I know I am asking a number of questions, but if someone is able to jot down the basics of the tool, codec and settings to achieve the best quality (regardless of the time it takes to get to it) - I'd be very grateful!
    Merry Xmas and a Happy New year to all,
    Eroka00

    I find your post somewhat confusing.
    These M2T files - are they converted?
    Run a file that you are using in Premiere through MediaInfo and post a screenshot.
    http://mediainfo.sourceforge.net/nl
    What is the end product going to be? YouTube, DVD, media player or .....

  • Please advise a novice computer artist on system requirements for large files in PS ?

    I am researching my next computer.
    I am an artist currently using CS2 to rework my paintings into prints.
    http://s719.photobucket.com/albums/ww198/blueridernz/Rachel%20Thompson%20Portfolio/?albumview=slideshow
    The paintings have been scanned at a high resolution and are 500 MB.
    The image files I work with are large but the process is simple: I am extracting and collaging.
    Nothing too technical here.
    As I will be printing large, 9' x 4', the end result file could be 2 GB, which I understand is the limit for PSD files.
    I am not gaming or editing video files, just working with large images.
    From what I have read I believe I need
    i7 Core
    12 GB RAM set of 3
    GeForce GTX 460
    Any advice truly appreciated,
    Rachel.

    The graphics card is perhaps a bit of overkill if you really just paint and touch up in PS; a less power-hungry and cheaper model would do just fine. The rest should be fine, but keep in mind that all this is not going to do much good if you don't get a 64-bit operating system and possibly also upgrade to CS5 to make use of all these resources. You might not even be able to install CS2 under Windows 7...
    Mylenium

  • Small NCP packets for large file read

    I am running the latest 4.91 SP4 client and am getting really slow performance from an application called bimap, which reads and processes large map files. When these files are on a NetWare server (OES or 6.5 NSS volume), Ethereal shows that NCP read requests are for only 512 bytes, and then an NCP read OK is returned with 512 bytes worth of data. It takes the program several minutes to read and process the file. When the same file sits on a Microsoft server, Ethereal shows bursts of TCP packets all with 1500 bytes of data, resulting in a much faster load.
    LIP is turned on and seems to work OK when copying files around the NetWare server - I see the bursts of 1500 bytes of data - but it doesn't seem to work when using this bimap application. Any ideas?

    Hi,
    Sasha wrote:
    >
    > They're not on the same segments; the routers are all correct according to
    > our Networks team. Ethereal definitely shows the NCP read request as
    > 512 bytes. Other applications such as Word opening large files are OK from the
    > same PC; Ethereal shows NCP requests of 4096. Once again, using the bimap
    > app and reading the same files stored on a Microsoft server works OK as
    > well. My understanding with LIP is that the large packet size will try to
    > be negotiated, and if it fails it reverts to its default of 512. I assume it
    > is failing and I don't know why?
    It sounds like the application is specifically reading 512-byte fragments on purpose. The fact that you see bigger reads on Windows is likely because oplocking (caching) is enabled on Windows, but isn't on NetWare.
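    As a tiny, hypothetical illustration of that read pattern (this is not the bimap code, just the general principle): an application that calls read() with a 512-byte buffer hands the client one small request at a time, while client-side caching or a larger buffer turns the same loop into far fewer, larger requests on the wire.

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Hypothetical sketch of the two read patterns discussed above.
    public class ReadPattern {
        public static void main(String[] args) throws IOException {
            byte[] chunk = new byte[512];

            // Pattern the trace shows: one 512-byte request per read() call.
            try (InputStream in = new FileInputStream(args[0])) {
                while (in.read(chunk) != -1) {
                    // process 512 bytes per round trip
                }
            }

            // Same processing loop, but reads are satisfied from a 64 KB buffer,
            // so far fewer (and larger) requests reach the file server.
            try (InputStream in = new BufferedInputStream(
                    new FileInputStream(args[0]), 64 * 1024)) {
                while (in.read(chunk) != -1) {
                    // still 512 bytes at a time to the application
                }
            }
        }
    }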
    CU,
    Massimo Rosen
    Novell Product Support Forum Sysop
    No emails please!
    http://www.cfc-it.de

  • Does Airport time capsule work well as a storage for large files, such as RAW Photos and HD Video?

    If Airport time capsule can work as a storage device, how reliable is it?  Are downloads into Airport dependent on internet speed, or on the time capsule wifi?

    If Airport time capsule can work as a storage device, how reliable is it?
    The Time Capsule was designed to be used for Time Machine backups, but can equally be used for storage. Like any network storage device it is relatively reliable. However, you should always have a good backup strategy for important files.
    Are downloads into Airport dependent on internet speed, or on the time capsule wifi?
    Data transfer rates "into" the Time Capsule are dependent on a number of factors. Which factors depend on the original source of the files. Regardless, the common factors will be the TC's internal hard drive interface and data write speeds of the drive itself.

  • Which format on a flash drive for large files for use by Mac and PC

    I need to copy large files (a 9 GB movie file exported from iMovie of a school graduation) onto 16 GB flash drives so they can be used by school parents who may have a Mac, PC or even a TV.
    My first attempt says the files are too large.
    A lot of googling tells me this is a recognised problem, but I am confused about which advice to take.
    I am on a 2012 iMac running OS X version 10.7.4.
    Do I need to download some software to run NTFS? It all sounds so confusing!
    I ended up in this predicament because the quality of the photo slideshows I copied to my DVD-R was so bad. Thought a flash drive would be easy. Ha, not at all.
    Please answer in layman's terms - I could barely follow some of the advice I found.....

    Format the flash drives with Disk Utility in ExFAT format.

  • IPlanet 6.0 SP4 crashes for large file uploads

    Environment:
    iPlanet 6.0 SP4, iPlanet App Server 6.5 with Maintenance Update 3 installed, running on the same machine on Solaris 8.0 (SPARC).
    When I try to upload a large file (> 10 MB) using HTTP POST multipart/form-data, the web server crashes with the following errors.
    [18/Oct/2002:16:52:02] catastrophe ( 600): Server crash detected (signal SIGSEGV)
    [18/Oct/2002:16:52:02] info ( 600): Crash occurred in function memmove from module /export/home/iplanet/web60sp4/bin/https/lib/liblibdbm.so
    [18/Oct/2002:16:52:02] failure ( 376): Child process admin thread is shutting down
    [18/Oct/2002:16:52:05] warning ( 624): On group ls2_default, servername oberon does not match subject "192.168.0.35" of certificate Server-Cert.
    [18/Oct/2002:16:52:05] info ( 624): Installing a new configuration
    [18/Oct/2002:16:52:05] info ( 624): [LS ls1] http://oberon, port 80 ready to accept requests
    [18/Oct/2002:16:52:05] info ( 624): [LS ls2] https://oberon, port 443 ready to accept requests
    [18/Oct/2002:16:52:05] info ( 624): A new configuration was successfully installed
    [18/Oct/2002:16:52:05] info ( 624): log.cpp:openPluginLog() reports: Environment variable IAS_PLUGIN_LOG_FILE is not set. All plugin messages will be logged in the web server log file
    [21/Oct/2002:10:40:02] catastrophe ( 1210): Server crash detected (signal SIGSEGV)
    [21/Oct/2002:10:40:02] info ( 1210): Crash occurred in NSAPI SAF gxrequest
    [21/Oct/2002:10:40:02] info ( 1210): Crash occurred in function __1cIGXBufferLAllocMemMap6ML_L_ from module /export/home/iplanet/app65mu3/ias/gxlib/libgxnsapi6.so
    [21/Oct/2002:10:40:02] failure ( 715): Child process admin thread is shutting down
    [21/Oct/2002:10:40:05] warning ( 1230): On group ls2_default, servername oberon does not match subject "192.168.0.35" of certificate Server-Cert.
    [21/Oct/2002:10:40:05] info ( 1230): Installing a new configuration
    [21/Oct/2002:10:40:05] info ( 1230): [LS ls1] http://oberon, port 80 ready to accept requests
    [21/Oct/2002:10:40:05] info ( 1230): [LS ls2] https://oberon, port 443 ready to accept requests
    [21/Oct/2002:10:40:05] info ( 1230): A new configuration was successfully installed
    Do I need to set anything in the web server or app server? Any help is appreciated.

    Be sure to roll back to 16.0.1; the initial v16 update had a serious security flaw.
