Solution for headquarters database

We have a distributed architecture with around 50 databases in different locations.
To generate MIS reports at one central server, we are doing the following.
1. Take an export of selected tables from the different locations.
2. Truncate the base tables every time before import.
3. Migrate data from the base tables to the query tables through one procedure.
For example:
1. Export the emp table from each site.
2. At the central server:
create table emp_hq with the same structure as emp
truncate table emp
import the emp table
migrate the data from emp to emp_hq
Finally, generate the MIS reports from the emp_hq table.
My question is whether the above is the right approach, and how can I improve the
process?
Problem Areas
The import is taking a lot of time.

Actually, we don't have network connectivity between the sites.

Without network connectivity between the databases, your options might be limited to what you are currently doing to move the data.
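
For the final migration step, a direct-path insert is usually much faster than row-by-row processing inside a procedure. A minimal sketch, assuming the staging table emp and the query table emp_hq described above:

-- One-time setup: create the query table with the same structure as emp.
create table emp_hq as select * from emp where 1 = 0;

-- After each import into emp, append the new rows in direct-path mode.
insert /*+ APPEND */ into emp_hq
select * from emp;
commit;

If the import itself is the bottleneck, Data Pump (expdp/impdp) is typically much faster than the original exp/imp utilities and is worth testing.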

Similar Messages

  • Looking for the best solution for downgrading a database from 11.2 to 11.1

We have an install of 11.2 Grid Infrastructure and have made several attempts at adding 11.1 binaries and using the 11.2 infrastructure. We patched, etc., but all of our attempts to get 11.1 to work across the cluster failed.
We have decided to make all of the installs version 11.1.
I am looking for the most straightforward approach to migrate our 11.2 down to 11.1 with the least amount of downtime.
    thanks, tom

    Hi Tom;
Please check the notes below:
Master Note For Oracle Database Downgrade [ID 1151427.1]
How To Downgrade From Database 11.2 To Previous Release [ID 883335.1]
I believe they will answer all your questions.
Please also check the link below:
    http://blogs.oracle.com/db/2010/09/master_note_for_oracle_database_downgrade_doc_id_11514271.html
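For reference, a heavily simplified sketch of the downgrade flow those notes describe (exact steps vary by patch level and RAC configuration, so treat this as an outline, not a runbook):

-- From the 11.2 Oracle home:
startup downgrade
spool downgrade.log
@?/rdbms/admin/catdwgrd.sql
spool off
shutdown immediate
-- Then start the database from the 11.1 home and run
-- @?/rdbms/admin/catrelod.sql to reload the dictionary at 11.1.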
Regards,
    Helios

  • What are solutions for a way-too-big database?

    Hi guys!
I'm a software developer and not very good at database design. One day, I was asked something like this:
    "For example, there is a company with a web application. One day, the database for that application is way too big, caused performance issues and others, what is the solution for that application database?"
    At first, I thought that was about using multiple database with single app. But I don't know if I was right.
    I want to ask that what are the solutions? If it's "multiple database" then what should I do? Using two connection to 2 database simutaneously?
    I appreciate any replies. Thanks!

847617 wrote:
Thanks Lubiez Jean-Val... for your links.
I've got some more advice, like:
- "transferring workload to another database using different techniques to copy the data from original db"
- "redesign of the database"
So that means we use 2 different databases?

Sometimes it is deemed desirable to keep only fairly recent data on the OLTP database, where the normal transaction activity happens, and replicate the data to another database that also contains historical data. This second database is used for heavy reporting tasks.
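
One common way to implement that replication in Oracle is a materialized view refreshed over a database link. A minimal sketch, with made-up object and link names, assuming the master table has a primary key:

-- On the OLTP (master) database: track changes for fast refresh.
create materialized view log on sales;

-- On the reporting database, pulling across a database link:
create materialized view sales_mv
  refresh fast on demand
  as select * from sales@oltp_link;

-- Refresh periodically, e.g. from a scheduler job:
exec dbms_mview.refresh('SALES_MV');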
    And "redesign"?As in, design it from scratch and do it right this time. Make sure all data relations are properly defined to Third Normal Form; make sure all data is typed properly (use DATE columns for dates, NUMBER columns for numbers, etc); make sure you have designed effective indexing; make sure you use the capabilities of the rdbms and do NOT just use it as a data dump.
    See http://www.amazon.com/Effective-Oracle-Design-Osborne-ORACLE/dp/0072230657/ref=sr_1_3?s=books&ie=UTF8&qid=1301257486&sr=1-3
Are they really good solutions?

Like most everything else, "it depends."
It depends on whether the proposed solutions are implemented properly and address the root problem. The root problem (or even the perceived problem) hasn't yet been defined. You've just assumed that at some undefined point the database becomes "way too big" and will cause some sort of problem.

It's assumed that we don't have or can't use partitioning.

And why is that assumed? Yes, you have to have a version of Oracle that supports it, and it is an extra-cost license. But like everything else, you and your management have to do a hard-nosed cost/benefit analysis. You may think you can't afford the cost of implementing partitioning, but it may be that you can't afford the expenses derived from NOT implementing it. I don't know what the case is for you, but you and your management should consider the factors instead of just rejecting it out of hand.
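
For illustration, a minimal range-partitioning sketch (table and column names are made up); old partitions can then be archived or dropped without touching current data:

create table orders (
  order_id   number,
  order_date date,
  amount     number
)
partition by range (order_date) (
  partition p2010 values less than (date '2011-01-01'),
  partition p2011 values less than (date '2012-01-01'),
  partition pmax  values less than (maxvalue)
);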
:):)... You are making me, a student, so excited about the history. From slide rules to the moon...

SQL Server 2012 - What Is The Best Solution For Creating a Read-Only Replicated/AlwaysOn Database

Hi there. I was wondering if someone has a recommendation for the following requirement I have regarding setting up a third database server for reporting.
    Current Setup
    SQL Server 2012 Enterprise setup at two sites (Site A & Site B).
    Configured to use AlwaysOn Availability groups for HA and DR.
    Installed on Windows 2012 Servers.
This is all working; failover works fine with no issues. So…
    Requirement
A third server needs to be added for reporting purposes, located at another site (Site C), possibly in another domain. This server needs to have a replicated, read-only copy of the live database from Site A or Site B, whichever is in use. The Site C reporting database should be as up to date with the Site A or Site B database as possible, preferably within a few seconds.
Solution - the options I believe are available to me
I believe I can use AlwaysOn and create a read-only replica for Site C. If so, do I assume Site C needs the Enterprise edition of SQL Server, i.e. to match Site A and Site B?
Using log shipping, which, if I am correct, means Site C does not need an Enterprise edition.
    Any help on the best solution for this would be greatly appreciated.
    Thanks, Steve

For AlwaysOn, all nodes should be part of one Windows cluster. If Site C is in a different domain, I do not think it works.
Log shipping works as long as the SQL Server at Site C is the same or a higher version (SQL 2012 or above). You can only do read-only access there.
IMHO, if you can put Site C in the same domain, then AlwaysOn is the better solution; otherwise, use log shipping.
Also, if your database uses Enterprise-level features such as partitioning or data compression, you cannot restore the database on lower editions, so you need Enterprise edition.
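If the domain issue can be solved, adding Site C as a readable secondary would look roughly like the following T-SQL, run on the current primary. This is only a sketch; the availability group, server, and endpoint names are made up:

ALTER AVAILABILITY GROUP [AG_Live]
ADD REPLICA ON N'SITEC-SQL01'
WITH (
    ENDPOINT_URL = N'TCP://sitec-sql01.corp.local:5022',
    AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,  -- near-real-time, not synchronous
    FAILOVER_MODE = MANUAL,
    SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY)
);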
    Hope it Helps!!

  • Seeking a solution for synchronization of sequences in expdp

    Hi all,
When I export tablespaces from a live database, the tables are exported first and then the sequences. While the sequences are being exported, because this is a live database, they are still being incremented by new insertions, so the exported sequences do not match the exported table data.
    I am using Linux OS with oracle version 10.2.0.4
Please suggest a solution for this.
    Thanks & Regards,
    Ravi kumar

Sequences get exported before the table data. I find it kind of annoying too that the sequences can be behind the data after an import. To get around this problem, I run the following to create a script that regenerates the sequences. Do this any time after the export is complete and you will have no problem.
    set linesize 300
    set heading off
    set trimspool on
    set trimout on
    set pagesize 0
    spool regen_seq.sql
    Select 'DROP SEQUENCE '       || sequence_owner ||'.' || sequence_name || ';' || chr(10)
                                  ||
           'CREATE SEQUENCE '             || sequence_owner ||'.' || sequence_name ||
           ' MINVALUE '                   || min_value     ||
           ' MAXVALUE '                   || max_value     || chr(10) || '      ' ||
           ' INCREMENT BY '               || increment_by  ||
    --       ' START WITH 1'                 ||
           ' START WITH '                 || last_number   ||
           ' CACHE '                      || DECODE(cache_size, 0, 20, 1, 20, cache_size)    ||
    --       ' CACHE '                      || cache_size    ||
           ' '                            || DECODE(order_flag,'N','NOORDER', 'ORDER')  ||
           ' '                            || DECODE(cycle_flag,'N','NOCYCLE', 'CYCLE')  ||
           ';'                            || chr(10) || chr(10) || chr(10)
      From dba_sequences
Where sequence_owner = '<my schema>';
spool off
-- After the import completes, run the generated script: @regen_seq.sql

Solution for scanning multiple pages from inside Forms 6i/10g

    Hi All,
I need to scan multiple pages from Oracle Forms (6i/10g) using a scanner that scans multiple pages per minute. Can anyone post a solution for me?
    Hafeez

Here are the two ways I use to scan images.
1) If the image is a single TIF image, then d2kwutil can upload the image into a BLOB column in the database (but it is limited to SINGLE-page TIFs).
2) If the image is a single- or multi-page PDF, then your Forms client must copy the image to a filesystem location known to the database (a directory on the database server), and then the database can import the image into a BLOB column within the database.
Either solution works pretty fast for me (<10 seconds per image).
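For the second approach, the server-side import can be done with DBMS_LOB. A minimal sketch, assuming a directory object pointing at the shared location and a hypothetical scanned_docs table:

create or replace directory scan_dir as '/u01/scans';

declare
  l_bfile bfile := bfilename('SCAN_DIR', 'page001.pdf');
  l_blob  blob;
  l_dest  integer := 1;
  l_src   integer := 1;
begin
  -- Create the row first, returning its empty BLOB locator.
  insert into scanned_docs (doc_id, doc)
  values (1, empty_blob())
  returning doc into l_blob;
  -- Stream the file from disk into the BLOB.
  dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
  dbms_lob.loadblobfromfile(l_blob, l_bfile, dbms_lob.lobmaxsize, l_dest, l_src);
  dbms_lob.close(l_bfile);
  commit;
end;
/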
    Chris

  • What's the best storage solution for a large iLife? RAID? NAS?

I'm looking for an affordable RAID storage solution for my Time Machine, iTunes Library, iMovie videos, and iPhoto Library. To this point I've been doing a hodgepodge of external hard drives without the safety of redundancy, and I've finally been bitten by HD failures. So I'm trying to determine what would be the best recommendation for my scenario: a small home office for my wife's business (just her), and me with all our media. I currently have a mid-2010 Mac Mini (no Thunderbolt), she has an aging 2007 iMac and a 2006 MacBook Pro (funny that they're all about the same benchmark speed). We have an AppleTV (original), an iPad 2, and two iPhone 4S's.
    1st Question: Is it better to get a RAID and connect it to my Airport Extreme Base Station USB port as a shared disk? OR to connect it directly to my Mac Mini and share through Home Sharing? OR Should I go with a NAS RAID?
    2nd Question: Simple is Better. Should I go with a Mac Mini Server and connect drives to it? (convert my Mac Mini into a server) or Should I just get one of those nice all-in-one 4-bay RAID drive solutions that I can expand with?
    Requirements:
    1. Expandable and Upgradeable. I don't want something limited to 2TB drives, but as drives get bigger and cheaper I want to easily throw one in w/o concerns.
2. Simple integration with Time Machine and my iLife: iTunes, iMovie, iPhoto. If iTunes' Home Sharing feature is currently the best way of using my media across multiple devices, then why mess with it? I see "DLNA certified" storage on some devices and wonder if that would just add another layer of complexity I don't need. One more piece to make compatible.
    3. Inexpensive. I totally believe in the "You Get What You Pay For" concept. But I also realize sometimes I'm buying marketing, not product. I imagine that to start, I'm going to want a diskless system (because of $$$) to throw all my drives into, and then upgrade bigger drives as my data and funds grow.
4. Security. I don't know if it's practical, but I like the idea of being able to pop two drives out and put them in my safe and then pop them back in once a week for the backup/mirroring. I like this idea because I'm concerned that onsite backup is not always the safest. Unfortunately those cloud-based services aren't designed for terabytes of raw family video, or an entire media library that isn't wholly from the iTunes Store. I can't be the only one facing this challenge. Surely there's an affordable way to keep a safe backup for the average Joe. But what is it?
5. Not WD. I've had bad experiences with Western Digital drives, and I loathe the consumer-packaged backup software that comes preloaded on their external drives. They are what I mean when I say you get what you pay for. Prettily packed garbage.
6. Relatively Fast. I have put all my media on an external drive before (back when it fit on one drive) and there's noticeable spool-up hang time. Thunderbolt's nice and all, but so new that it's not easily available across devices, nor is it cheap. eSATA is not really an option. I love FireWire, but I'm getting the feeling that Apple has made it the red-headed step-child of connections. USB 3.0 looks decent, but like eSATA, Apple doesn't recognize it exists. Where does that leave us? Considering this dilemma, I really liked Seagate's GoFlex external drives because it meant I could always buy a new base and still be compatible. But that only works with single drives. And as impressive as Seagate is, we can't expect them to consistently double drive sizes every two years like they have been, cool as that may be.
    So help me out without getting too technical. What's the best setup? Is it Drobo? Thecus? ReadyNAS? Seagate's BlackArmor? Or something else entirely?
    All comments are appreciated. Thanks in advance.

I am currently using a WD 2TB Thunderbolt hard drive for my iTunes library, which I love and which works great. I am connected directly to my MacBook Pro. I am running low on space and am thinking of buying a bigger hard drive. My question is: should I buy a 6TB Thunderbolt HD or a 6TB NAS drive to work solely for iTunes? I have Home Sharing enabled for my Apple TV.
I also have my Time Capsule connected, just as backup.

  • Solution for intermittent use of a single Adobe product?

I may be a unique case, because I do not want or really need all of the products that Adobe provides me for $30/month. Here is my dilemma:
    I use Adobe illustrator infrequently, sometimes intensively for a month or so, but then I will not use it for the better part of a year. This is how I use the application.
So, I feel that I'm being financially penalized for continuing to use Illustrator, since I pay for it every single month of my life but use it infrequently. For example, if I use the program 20 times in 10 years (which is about how I do use it), then I am paying (according to the current student deal, which I am sure will change in the coming 10 years' time) $3600.00 over the course of 10 years to use the program 20 times. That works out to $180.00 per use.
    Obviously I am not in good conscience able to renew my subscription with Adobe, and would have ended it earlier were it not for the severe financial penalty Adobe imposes on users for breaking a subscription contract before its official expiration.
Before I go, I just wonder if Adobe has any care for users such as myself. I know one unsatisfactory response already: I can buy the full version, not get incremental updates, and will need to pay full price for each major upgrade. That's a bad deal too. How can an infrequent user of a single Adobe product (Illustrator) use the application for a reasonable amount of money? I don't need every Adobe product under the sun, nor do I like the fact that somewhere in Adobe's hierarchy they think that is some kind of advantage to users. To me, that is product bloat, bordering on spam marketing. If I want a sip of water, I have to drink from the fire hydrant. This is so bone-headed.
I suppose, overall, I will advise my students to use whatever software the company they work for uses. But as far as a personal graphics package for independent work goes, I cannot recommend Adobe products anymore because of this pricing scheme, which really does penalize people who only use a product once in a while. But before I go, am I missing something? Is my assessment of this dilemma on target? I know I can no longer afford Adobe products, but I also want to advise my students in a way that is fair to them. Currently I am working on a module that shows all of the vector mapping graphics programs and breaks them down according to their feature set and cost for the user.
I've been using Adobe Illustrator for over two decades, and I will be sad to not have it available to me in the future, but this new business model and pricing structure no longer make sense for me. Yes, considering the time and energy I've put into the program, this is an emotional decision, but ending my relationship with Adobe is the healthy choice. It's like walking away from an abusive relationship of the passive-aggressive variety. Adobe, what a drag!

On the face of it, the $30 for every monthly use seems like it might be a good solution for intermittent users. But, as designers, I think many of us might like to take a brief look at an idea (sometimes just for the fun of it). And so, if during one month I want to take a peek at some of my work with no real intention of doing any work, I would need to pay $30 to look at my own work. This kind of situation creates a kind of "double bind" for me. It's an uneasy feeling, having a kind of oppressive effect on the creative flow of ideas. Consider this scenario: "I think I can improve that logo I did last summer. I wonder what this change would look like?" Well, to find out, that will be $30.00. For me, that's an uncomfortable position. Again, thanks for the thoughtful reply.

Service account for the workspace database service permission error while creating a Tabular model from PowerPivot

    Hi All,
Please help me out with this issue. I have spent so much time (3 working days) just figuring out what the issue is and finding its solution.
I am learning Tabular mode and trying to create a model based on a PowerPivot model. I am getting the following error message:
    'The PowerPivot workbook could not be imported. The service account for the workspace database server does not have permission to read from the PowerPivot workbook.'
    Here is my infrastructure:
    1. SSAS in Tabular Mode is installed on my Windows 8 Laptop
    2. PowerPivot is also in my laptop
    3. There is only my account (as Admin of course) for SSAS
    Here are my questions:
    1. What is this error and how can I cope with that? A step by step explanation would be highly appreciated :-)
    2. Do I need to change something in Windows settings or in SSAS?
3. I am confused about my workspace database server as well. Do I have to install SSAS twice, once for development and once for the workspace?
Looking forward to the experts' advice.
    Tahir
    Thanks, TA

    Hi,
    I suspect you might have more luck if you try the SSAS forum: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/home?forum=sqlanalysisservices
    Regards
    Jamie
ObjectStorageHelper<T> – A WinRT utility for Windows 8 | http://sqlblog.com/blogs/jamie_thomson/ | @jamiet | About me

  • Is RoboHelp Server the solution for us?

    Hi all,
    I'm trying to figure out whether RH Server is the solution for us. I've seen a few similar posts in the forums, but none that really answer my specific questions. I'm hoping for feedback from someone with experience in RH Server, who is perhaps in a similar situation to us.
    We are a software development company, creating desktop software for PCs. We have three main applications, and approximately 20,000 users. I am the company's (only) Tech Writer.
    Currently I provide documentation (user guides) to our clients in the form of the good ol' CHM file. I develop these in RH8. Each of our applications is bundled with an installed CHM file which users access by pressing F1. I'm sure you're all familiar with this process. Our users run our software on their own systems, completely independent of us - there's no connection to us whatsoever. However, they all have Internet access, as it's a requirement for them to be able to use certain functionality within our applications. Hold that thought.
I also write Knowledge Base Articles in MS Word, as PDFs, which users access via our web site. Internal to our organisation, I provide KBAs and other such material for our Technical Support staff. These aren't available/accessible to the public. I write these in Word because they need to be sent around to a team of reviewers from time to time, and whilst I'm the only person with RH installed, everyone has Word.
    What I'm hoping to achieve with RH Server is this:
    Do away with the old CHM files that our users have. They're just awful. I'm hoping that in future, when our users press F1 from within our applications, they will be taken to a corresponding page on a web site or AIR page similar to what the Adobe Help looks like when we access it from RoboHelp. Forgive my ignorance if I've used incorrect terminology. That way they'll always be reading the latest Help content, live, online, instead of what I wrote last time we sent a product update out. I imagine our programmers will have to edit the functionality behind the action of our users pressing F1 in our applications. Has anyone done this? Are there any issues we should be aware of? What happens if our users don't currently have Internet access, for example - is it possible to call a local version of the Help instead, if this is detected? Has anyone done this with a similar number of users to us? We could well have 100s of users trying to access the system simultaneously.
    Gather feedback from users. From what I understand, we'll be able to use RH Server to see which Help topic/content is being viewed, and also receive feedback from users. Can anyone give me feedback on their experiences with this? Any tips/hints/issues I need to be aware of? Is it possible for us to determine which users accessed which content? Remember, the idea is that users should be able to access this content by pressing F1 - I don't want them to have to sign in every time they need to access our Help system - it should be seamless to them. So, I'm wondering how it would be possible to track user usage without making them sign in. This is important to us because some of our content is region-specific, and it would be handy to know if users from those regions are actually accessing the Help content that relates to them.
    Host our internal documentation on the same server as our public documentation. Is it possible to host all of our internal, private documentation on the same RH Server, making it available to our Tech Support team (and other internal teams) only? I imagine we could do this by password-protecting it, but I want to ensure that the public don't even know it exists. ...and our Tech Support people would not be impressed with having to sign in every time they wanted to access their Knowledge Base. Any tips here regarding locking down / restricting access to content?
    A quick note about collaboration:
    Currently, although the Help menus are developed in RH, the KBAs and other PDF documents are written in MS Word. I send them around to a team of reviewers who add their comments/edits and send them back to me. From what I understand, this is something I can do with RH10 - export PDFs and send them around for review, combining the results later, at which time I give them a final review before publishing. Have I understood this correctly? Does RH Server play a part in this process? Can I use RH Server's feedback capabilities as a mechanism for my review team to make edits/comments? I guess I'm trying to get an understanding of how sophisticated the RH Server feedback system is. If I can use RH Server to have the team read/review documents, it'll save me having to manage a bunch of Word documents that I email them. It'll also minimise the chance that a redundant document gets distributed by mistake - something that can occur because people use their locally-saved documents I emailed them earlier, instead of the finals.
    Thank you.

    Hi, symmetricalMan
    Let's see if I can tackle some of these "inline". There are a lot of moving parts to your system (you're a busy guy!)
I only have time for a few of these. Perhaps Colum McAndrew and others will chime in with their experiences.
    >>they will be taken to a corresponding page on a web site or AIR page similar to what the Adobe Help looks like when we access it from RoboHelp.
    It would be WebHelp Pro in the scenario you mention (not AIR Help).
    By "taken to a corresponding page" you are referring to Context Sensitive Help which RoboHelp Server does support (including your F1 scenario).
    >>What happens if our users don't currently have Internet access, for example - is it possible to call a local version of the Help instead, if this is detected?
Hmm. You could package a plain WebHelp (not Pro) output and distribute it for access on a share drive. The detection thing would be up to your developers. Come to think of it, AIR Help does have a potential alternative here which might be worth looking into. Obviously, there would be two systems to maintain. I'm not up to date on it, but you'll find info here in the online help: http://help.adobe.com/en_US/robohelp/robohtml/WS81F63111-6ACF-4a02-B2B2-461FEBFA8093.html
>>we'll be able to use RH Server to see which Help topic/content is being viewed, and also receive feedback from users.
    Actually, the "feedback" is anonymous (no names are collected.) You can however, create "Areas" and analyze the traffic on topics according to sub-sets of your users.
    The feedback is not direct from the users. In other words, RoboHelp Server (at least for now) does not support Commenting (as AIR Help does). So Feedback Reports are derived from the end-users "surfing" your site and collecting their search terms verbatim to get an idea of what they are searching for in order to improve your content.
    >>So, I'm wondering how it would be possible to track user usage without making them sign in.
    RH Server Sites do not have to be "Protected" by authentication. It is your choice. You can have some sites (called Areas) that are authenticated and some sites that are not authenticated, all on the same RH Server. RoboHelp Server uses a database and can authenticate users (by setting up protected "Areas".) However, my networking knowledge is limited and you would have to ask someone else about "persistent logins" etc.
    >>content is region-specific, and it would be handy to know if users from those regions are actually accessing the Help content that relates to them.
    Yes, you can do this. This is where RoboHelp Server can be used to create "Areas" for different content to be delivered to different audiences.
    >>Is it possible to host all of our internal, private documentation on the same RH Server,
    Yes you can. However the Tech Support sign in scenario question would have to be answered by someone else. It's hard to know from where I sit.
    >>RH10 - export PDFs and send them around for review, combining the results later, at which time I give them a final review before publishing
Yes, this workflow would seem to work for you. However, RoboHelp Server plays no role in this review one way or the other. There are many alternatives for sharing the PDF, which are described in the documentation.
    See #6 on this page:
http://help.adobe.com/en_US/robohelp/robohtml/WS1b49059a33f77726-2db1c75912bc47baaf8-7ffb.html
    You should also download the Adobe RoboHelp Server Reviewer's Guide which also has videos embedded.
http://www.adobe.com/support/documentation/en/robohelp/9/AdobeRoboHelpServer9_ReviewersGuide.pdf
    Hope this helps
    John Daigle
    Adobe Certified RoboHelp and Captivate Instructor
    Evergreen, Colorado
    www.showmethedemo.com
    Twitter: @hypertexas

  • Make your own Fax Server with Automator! (Pagesender solution for Mavericks)

I have been scouring these discussion boards for some time now looking for a suitable substitute for PageSender, an awesome fax solution for the Mac from SmileOnMyMac LLC, which for some inexplicable reason stopped development and updates after OS 10.6.8. The result is that many small business office users who still rely on fax (and yes, no matter what they tell you, most of the business world DOES still use fax, because it's legally binding and more secure than email for the transmission of legal documents or healthcare records, and does not rely on database integration across different systems, which is sadly but very realistically still a long way off) no longer have a way to integrate faxes into a paperless or digital workflow office system.
I suspect, like many folks who receive faxes, those who used PageSender used a very powerful feature to forward faxes by email, thereby turning your Mac into a fax server that could distribute your faxes to other workstations and staff throughout the business via email. Presumably, if you have your own email server (Exchange, Kerio, Apple Mail Server, Postfix Enabler, etc.) you could distribute faxes on your own internal network, securely behind a firewall, and effectively create a digital/paperless workflow for your faxes.
Even if you have a USB modem or multifunction printer that allows you to receive a fax to your desktop (Apple's internal fax via printer preferences, and some HP models like the HP MFP 127fw, for example, will allow you to receive a fax to a desktop folder or forward it to a single email address), the former is of limited functionality and the latter only lets you send to an email address out over the internet with a registered public domain, which means you give up all control of privacy and can't process it through a private mail server to create a digital workflow for your office...
    ...Until now!!!
I am happy to report that I have finally discovered a very easy and usable approach that will save a lot of time, money, and headaches for those looking to create a digital workflow and fax server system for a small office. (I don't think there is any limit to scale here, but I suspect offices with more than 10 employees probably have a BizHub or HP MFP/digital sender that can create the same process directly from the printer, but of course these come with a price tag of $2000 and up...)
    To accomplish this however, you will need some basic requirements which are as follows:
1) A USB modem from either US Robotics or Zoom. These are readily available from Amazon, MacMall, or any number of other online vendors, and work very well and seamlessly with all Macs running OS X right up through Mavericks
    OR
A multifunction printer that is capable of receiving faxes to a desktop Mac, like the HP 127fw. Models from other manufacturers exist as well, but you will have to do a bit of research and probably check with the vendor or user manual directly to confirm that fax-to-desktop is supported for Mac and OS 10.9.
2) A dedicated mail server (MSFT Exchange, Kerio, Mac OS X Server with the mail service enabled, or Postfix Enabler or MailServe from Cutedge Systems)
You will need to set up an email account on your server that is the parent for all incoming faxes, from which the faxes will be sent out as part of your digital workflow. This is beyond the scope of this discussion, but if you've come this far and you're still reading, you probably know how to do this already. I recommend setting this up as a POP account, not IMAP. This way, the attachments (your faxes) will always remain on your server as a backup until you delete them from the server.
3) Now go to System Preferences and select "Printers and Scanners". Select either the fax printer for your multifunction printer, or add a fax printer/receiver using the + button and select "Fax" if you are using a USB modem. You must have the USB modem attached to the computer in order to use the built-in Apple fax feature for the latter option.
4) Now click on the receive options. Select "Receive faxes on this computer" and set your ring answer settings. Check "Save to" and select the designated folder (either Faxes or Shared Faxes on your computer) or create a new folder. Depending on the volume of faxes and your backup systems, you may want to designate a separate folder on a separate drive exclusively for your faxes. This is where all your faxes will be stored.
    5) Now launch "Automator" in your applications folder and create a new workflow. You will be presented with several options. Select "Folder Action".
6) At the top right of the window space you will see "Folder Action receives files and folders added to". Select the fax folder you created in step 4.
7) On the left-hand side of the "Actions" menu, select "Mail".
8) From the list of actions select "New Mail Message". This will take the latest fax added to your fax folder and attach it as a PDF to a new outgoing mail. In the "To" address, put the email address that belongs to the parent account you created for the faxes on your mail server, e.g. [email protected]. In the subject field you can put "Fax Workflow" or any other generic subject that will identify to all recipients that this is an email that contains a fax/PDF attachment.
Under "Account", use the SMTP account you set up on your mail server to handle the routing of internal emails. In most cases, this will be the same as the parent account created above. (Effectively, this account is sending and receiving emails to itself.)
    9) From the list of actions, select "Send outgoing messages".
    10) Save the Automator workflow with a name like "FaxDistribution" or "FaxFlow".
    11) Go back to the Fax folder you created in step 4. Right click or option click on the folder and scroll down the options menu and select "Folder Actions Setup". You will see a list of scripts including the Automator workflow you just created. Choose it.
That's it!! From now on, when you get a fax, it will get dumped into the designated fax folder, and this will automatically trigger the workflow to attach and send it as an email to anyone in your office who is set up to receive emails at the "faxserver" address. You now have a paperless, digital fax-workflow server system for distributing your faxes to anyone in your office who needs to review them. Good luck!

    Thank you for this interesting posting.

  • Enterprise Solution for communication??

    Hi,
We need to transfer PDF files between various environments (AS/400, WIN2K, AIX, etc.) along with detailed information about the PDF files. The receiver will receive the PDF file along with the details and populate that information into their database. The receiver will have their own APIs to handle this. But we are looking for an enterprise solution for communicating between these environments.
Please give the pros and cons of various technologies like JMS, SOAP, HTTP, etc.
    Thank you,

You don't need to 'send' your report to the other environments; you are merely accessing the report from a shared file system.
NFS allows you to share a common file system, which might be located on one of your existing systems. Or you can purchase an additional file server.
NFS is quite open, and I believe it supports all the systems you have. Finding NFS client software should be easy, with Windows probably the most complex. Several vendors (Hummingbird is the most notable, with NFS Maestro) provide Windows-based NFS clients.
Once finished with the server and client architecture, design a file tree that supports staging and releasing the docs, for example:
/var/pdfdocs
    /release
    /staging
NFS is very quick, and allows you to use standard file commands.
The metadata is stored in a standard relational DB, which each system can update as it sees fit.
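As an illustration of the metadata side, a minimal sketch of such a table (Oracle syntax; the names and columns are made up):

create table pdf_documents (
  doc_id      number primary key,
  file_path   varchar2(400),  -- location under /var/pdfdocs
  source_sys  varchar2(30),   -- AS/400, WIN2K, AIX, ...
  status      varchar2(10),   -- STAGING or RELEASE
  created_at  date default sysdate
);

Each environment writes the PDF to the NFS tree, then inserts a row here so the receiver knows what arrived and where it lives.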
    Read the following:
    NFS in general
    http://www.networkcomputing.com/912/912buyers2.html
    NFS on AS/400
    http://publib-b.boulder.ibm.com/Redbooks.nsf/RedbookAbstracts/sg242158.html?Open
    Windows NFS
    http://www.hummingbird.com/products/nc/nfs/index.html
    http://www.frontiertech.com/supernfs/product/prodguid.htm

What is the best solution for my problem?

Hello everyone,
I am planning to develop a Patient Management System (PMS).
Since I am not that much of a guru in the Java technologies, I would like some help with this decision:
What is the best solution for developing an intranet PMS which handles data (connecting to PostgreSQL) and runs some algorithms (for planning purposes) in the background?
1. Developing a web-based solution (servlets, JSPs, etc.)?
2. Developing an application which will be managing everything (a kind of program with its own GUI)?
3. You may have another idea for a solution; I will be glad for any advice.
PS: Sorry for my English, it's not that great ;)

    Web applications are all the rage these days and IMO rightly so as they greatly reduce deployment cost and headaches.
    They do have some drawbacks though which for you might be important.
    I can imagine that sooner or later (probably sooner in my experience) a requirement to interface with hardware installed on the client machines will emerge.
Whether this is a barcode scanner, a chipcard reader to get client information from some form of card issued to the client, or maybe a label printer to print out labels for medication bottles doesn't matter: you need to access the client machine directly.
    Web applications have great difficulty with that (in fact, it's all but impossible).
    You could go for an applet solution but that's contrived at best and still leaves the same problems.
So in your situation, an old-fashioned client/server application (maybe using web services to communicate with the server-side application) is likely the best long-term solution.
You'd have the database remote so everyone has access to the same data, accessed through processes run on the server and called by the clients. These can also do most of the business logic, leaving the clients relatively dumb, responsible mainly for data validation and entry and presentation of the results.

  • How can we suggest a new DBA OCE certification for very large databases?

    How can we suggest a new DBA OCE certification for very large databases?
What web site can we visit, or what phone number can we call, to suggest creating a VLDB OCE certification?
The largest databases that I have ever worked with were barely over 1 trillion bytes.
Some people told me that the results of being a DBA totally change when you have a VERY LARGE DATABASE.
I could guess that maybe some of the following topics on how to configure things might be on it:
    * Partitioning
    * parallel
    * bigger block size - DSS vs OLTP
    * etc
    Where could I send in a recommendation?
    Thanks Roger

    I wish there were some details about the OCE data warehousing.
    Look at the topics for 1Z0-515. Assume that the 'lightweight' topics will go (like Best Practices) and that there will be more technical topics added.
    Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
    Overview of Data Warehousing
      Describe the benefits of a data warehouse
      Describe the technical characteristics of a data warehouse
      Describe the Oracle Database structures used primarily by a data warehouse
      Explain the use of materialized views
      Implement Database Resource Manager to control resource usage
      Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
    Parallelism
      Explain how the Oracle optimizer determines the degree of parallelism
      Configure parallelism
      Explain how parallelism and partitioning work together
    Partitioning
      Describe types of partitioning
      Describe the benefits of partitioning
      Implement partition-wise joins
    Result Cache
      Describe how the SQL Result Cache operates
      Identify the scenarios which benefit the most from Result Set Caching
    OLAP
      Explain how Oracle OLAP delivers high performance
      Describe how applications can access data stored in Oracle OLAP cubes
    Advanced Compression
      Explain the benefits provided by Advanced Compression
      Explain how Advanced Compression operates
      Describe how Advanced Compression interacts with other Oracle options and utilities
    Data integration
      Explain Oracle's overall approach to data integration
      Describe the benefits provided by ODI
      Differentiate the components of ODI
      Create integration data flows with ODI
      Ensure data quality with OWB
      Explain the concept and use of real-time data integration
      Describe the architecture of Oracle's data integration solutions
    Data mining and analysis
      Describe the components of Oracle's Data Mining option
      Describe the analytical functions provided by Oracle Data Mining
      Identify use cases that can benefit from Oracle Data Mining
      Identify which Oracle products use Oracle Data Mining
    Sizing
      Properly size all resources to be used in a data warehouse configuration
    Exadata
      Describe the architecture of the Sun Oracle Database Machine
      Describe configuration options for an Exadata Storage Server
      Explain the advantages provided by the Exadata Storage Server
    Best practices for performance
      Employ best practices to load incremental data into a data warehouse
      Employ best practices for using Oracle features to implement high performance data warehouses
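
As a flavor of the parallelism topics above, a minimal Oracle sketch (the table name is made up):

-- Let the optimizer use up to 8 parallel servers for this table.
alter table sales parallel 8;

-- Or request parallelism per statement with a hint.
select /*+ parallel(s, 8) */ count(*) from sales s;

-- Parallel DML must be enabled per session before parallel inserts/updates.
alter session enable parallel dml;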

  • Storage solutions for Aperture Library & vault??

My Aperture library has grown to over 400 GB and I am concerned about storage as it grows larger week by week. I have been using two 500 GB FireWire 800 drives, one for the library and one for the vault. Today I purchased a 1 TB drive for the library; it works well and I actually see a speed improvement. I will pick up another next week for the vault.
Aperture lumps all of the data in one folder, therefore requiring the size of our hard drive to keep pace with the library. I need a different solution than replacing FireWire drives with larger-capacity ones every three to six months.
My main requirements are speed and expandability. I do not want to keep several Aperture libraries; to me, this defeats the purpose of an image database.
I have heard the pros and cons of RAID, and I do not think it's the best solution for an Aperture-based system. SATA sounds interesting and seems like it may become more standard. I have also read about using an outdated Mac as a server by installing several hard drives and connecting it to the workstation by gigabit Ethernet. I would like to have my library on a very fast connection; FireWire is fine for the vault. I am wondering what others are doing in the same situation as mine and whether you might have any advice.
With the purchase of the 1 TB drives, I figure that I will be fine until mid-to-late summer; by that time Apple will have their new desktop available, which I plan to purchase. It might be best to see what that machine will or will not support. Any thoughts?
    G5 dual 2.0   Mac OS X (10.4.6)  

    As the previous poster said: this is an Apple problem that needs to be solved. I might add, it probably will be in the near future. So, you should try to scrape by for the next few months as best you can and wait and hope and pray for a solution from Apple.
With DVD and FireWire drives backing up your originals, you are well (probably over) backed up. One copy on DVD is more than sufficient for your originals. Make more backups of the Aperture library, which includes your original files and your time-consuming metadata and adjustments.
RAID 0 is risky because if a single drive goes, then you lose everything! There are other RAID schemes that make your system more dependable. RAID 1 mirroring is safer but doesn't expand your storage capacity.
RAID 3 & 5 are both more dependable and give you bigger working volumes, but there is a lot of processor work going on to calculate parity (error-correcting and data-recovery information) that is spread across several drives. If you use Mac OS X RAID or SoftRAID, that processor power comes from your computer's CPU, which probably has enough work to do already without having to crunch numbers for every byte that goes to/from your hard drive.
RAID 3 & 5 are good systems if you have hardware to do the parity calculation, but they each require a minimum of 4 drives to make a set, which you don't want to use as a boot drive. That means you would need 5 drives inside your G5, or a combination of internal and external drives.
A cleaner solution might be getting a Wiebe Tech Silver SATA V case and port-multiplier card (about $1500 without drives). That will give you a good, fast connection for 5 external hot-swappable drives. Then use Ben Long's library spanner to get by until Apple comes to the rescue with a multiple-drive library.
