Certification Authority Backup and Redundancy

Hi,
I have installed the Certification Authority role on one of my DCs (Windows 2008 R2 Standard) to serve certificates for Exchange, Lync and other applications. I have a few questions, if you can help:
1- What will happen if this server goes down? Will all certificates installed on client servers stop working?
2- How can I back up this CA server so that it can be restored?
3- Is there any way to make this CA redundant?
Regards
Usman Ghani - MCITP Exchange 2010

1 - Certificates can no longer be validated once the most recent certificate revocation list (CRL) expires and the CA is not available to sign new CRLs. With the default settings, delta CRLs are valid for one day, so after one day applications that check CRLs (not all do!) would report issues.
2 - You should back up the CA's key and certificate manually (only needed after setup or after a renewal; use certsrv.msc). The registry key of the CertSvc service (the configuration) and the CA database should be backed up regularly (certutil -backupdb). Restoring the CA is similar to migrating a CA to a new server: you import the key and add the role using the option "Existing key and certificate".
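A minimal backup sketch, assuming an elevated command prompt on the CA and an example target folder C:\CABackup (adjust paths and scheduling to your environment):
:: Back up the CA certificate and private key to a password-protected PKCS#12 file
:: (certutil prompts for the password if -p is not supplied)
certutil -backupkey C:\CABackup
:: Back up the CA database and log files
certutil -backupdb C:\CABackup
:: Export the CertSvc configuration from the registry
reg export "HKLM\SYSTEM\CurrentControlSet\Services\CertSvc\Configuration" C:\CABackup\CertSvc-Configuration.reg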
3 - There is no option for 100% redundancy: setting up a second CA (with a different certificate and key) only makes the certificate issuance service highly available; the second CA cannot sign CRLs on behalf of the first (and you cannot have two CAs with the same Subject name in AD). You could use Windows Clustering, but in that case the database sits on shared storage, and I guess that is not an option anyway if the CA is on a DC.
I would rather recommend planning CRL validity periods and overlaps (a new CRL is published while the existing one is still valid) so that you have enough time to restore the service in case of a disaster. If the CA goes down just before a few days of bank holidays, how long would it take for somebody to be notified and the backup to be restored? I would not use delta CRLs unless you plan for extremely frequent revocations and have bandwidth issues; use only base CRLs instead.
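The CRL periods are controlled by registry values on the CA; a sketch with example values only (two-week base CRL, one-week overlap, delta CRLs disabled), to be adapted to your own recovery expectations:
:: Base CRL validity and overlap (example values)
certutil -setreg CA\CRLPeriodUnits 2
certutil -setreg CA\CRLPeriod "Weeks"
certutil -setreg CA\CRLOverlapUnits 1
certutil -setreg CA\CRLOverlapPeriod "Weeks"
:: Disable delta CRLs
certutil -setreg CA\CRLDeltaPeriodUnits 0
:: Restart the CA service and publish a new CRL so the settings take effect
net stop certsvc && net start certsvc
certutil -crl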
Elke

Similar Messages

  • Backup and Redundancy Oracle 11g

    Hi,
    1. If we set the redundancy to 2, does it mean 2 level 0 backups taken at 2 different points in time, one after another, or does it mean 2 copies of the same level 0 backup?
    2. What is the effect of redundancy 2 on incremental backups? Are they also going to be 2 copies?
    Thanks & Regards,

    Vijay.Cherukuri wrote:
    Hi,
    Have you read the above link ?
    Assume a different case in which REDUNDANCY is 1. You run a level 0 database backup at noon Monday,
    a level 1 cumulative backup at noon on Tuesday and Wednesday, and a level 0 backup at noon on Thursday.
    Immediately after each daily backup you run a DELETE OBSOLETE. The Wednesday DELETE command does not remove
    the Tuesday level 1 backup because this backup is not redundant: the Tuesday level 1 backup could be used to recover
    the Monday level 0 backup to a time between noon on Tuesday and noon on Wednesday. However,
    the DELETE command on Thursday removes the previous level 0 and level 1 backups.
    Also please see : http://web.njit.edu/info/oracle/DOC/backup.102/b14191/rcmconc1007.htm
    As far as I know, RMAN does not delete backups even when they are obsolete (i.e. they have crossed the retention policy); nothing is removed until you issue DELETE OBSOLETE.
    Please note that backups are not marked obsolete. They are evaluated for obsolescence when a command that references obsolescence (DELETE ... or REPORT ...) is issued. You can change your retention policy all you want, but until you actually DELETE OBSOLETE, nothing happens to those backups. Then they will be deleted (or not) as per the retention policy that is in effect at the time of the DELETE OBSOLETE command.
    That said, it is my understanding that if you keep your backups in the FRA, Oracle will automatically delete obsolete backups if needed to free up space for newer backups. I've not personally investigated this, so I could be wrong. If this is the case, I would imagine that Oracle still would not delete a backup as soon as it becomes obsolete, as that would require constant re-evaluation, but rather when it determines that it needs to find space for a new backup.
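    To see how a redundancy-based policy behaves, you can set it and ask RMAN what it now considers obsolete; a minimal command-file sketch with an example value:
    # Keep the two most recent level 0 backups of each datafile and control file;
    # REDUNDANCY does not create extra copies of a single backup.
    CONFIGURE RETENTION POLICY TO REDUNDANCY 2;
    # List what now falls outside the policy; nothing is removed until DELETE OBSOLETE runs.
    REPORT OBSOLETE;
    DELETE OBSOLETE;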

  • Does OIM Connector for Lotus Notes support Domino certification authority?

    Lotus Notes allows an Organization to register servers and users without stamping each server ID and user ID if you have migrated the certifier to a Domino server-based certification authority (CA).
    A Customer has done such a migration to a server-based certification authority (CA), and therefore they have set up Notes and Internet certifiers to use the CA process.
    So, now this Customer does not require access to the certifier ID and ID password.
    Having enabled certifiers for the CA process, they can now assign the registration authority role to administrators, who can then register users and manage certificate requests without having to provide the certifier ID and password.
    My question is: is this compatible with the requirements of Oracle Identity Manager Connector for IBM Lotus Notes and Domino Release 9.0.4, which, among other parameters, requires you to specify CertPath (the complete file specification of the certifier ID to be used when creating certifier ID files) and CertPwd (the password of the certifier ID file)?
    Regards,
    Angelo Carugati

    I'm quite new with OIM, but not at all... In any case, I need to configure a connector for Lotus Notes / Domino.
    The main points of my question are (using a connector for Lotus Notes / Domino):
    - How can I create 1 user account (and related data) on different servers (IT Resources), with different "mail templates", when the data should be the same and the user's mail database should only be a replica on the 2nd server?
    - Maybe I need to configure 2 distinct IT Resources and run both (through Provisioning Policies) when I need to provision a user, as described in my scenario above, right?
    - On the 2nd server, I don't want the user to be created with a new mail database (nor new user data, such as shortName, IDfile...).
    I want the same data, and a replicated mail DB, to be generated on the 2nd server (webmail server).
    Is this possible, and how can I configure it with the OIM connector for Lotus Notes / Domino?

  • Using Hyper-V 2012 r2, connecting to the console results in: A certification authority could not be contacted for authentication.

    I'm having some trouble with authentication to guests from my Hyper-V console.
    If I try to connect from the Hyper-V Manager to the console of any guest, I get the error:
    "A certification authority could not be contacted for authentication. If you are using a Remote Desktop Gateway with a smart card, try connecting to the remote computer using a password. For assistance, contact your system administrator or technical support."
    I'm not using an RDG and smart card.
    I have 2 virtual networks. The first is Production, the second is Isolated. Production has 2 NICs attached to the Production LAN, the second has 2 NICs in our DMZ. The host is a member server of the production domain. I can use MSTSC from the LAN or the DMZ
    to gain access to each Guest and the Host.
    The issues start if I try "Connect" from Hyper-V Manager in an attempt to use the console of any Guest. Each attempt fails with the above error. If I use an incorrect password, I get a different error: "The credentials that were used to connect
    to {Server FQDN} did not work. Please enter new credentials."
    Taking a look at the event logs, I can see the session successfully authenticating to the Guest (4776 Credential validation and 4624 Logon), and the fact that I get a different error if I enter an incorrect password shows I get some way along the line. However,
    if I take a look at the logs on the Host, I get:
    An account failed to log on.
        Subject:
            Security ID:        NULL SID
            Account Name:        -
            Account Domain:        -
            Logon ID:        0x0    
        Logon Type:            3
        Account For Which Logon Failed:
            Security ID:        NULL SID
            Account Name:        
            Account Domain:        
        Failure Information:
            Failure Reason:        An Error occured during Logon.
            Status:            0xC000006D
            Sub Status:        0xC000005E
        Process Information:
            Caller Process ID:    0x0
            Caller Process Name:    -
        Network Information:
            Workstation Name:    -
            Source Network Address:    -
            Source Port:        -
        Detailed Authentication Information:
            Logon Process:        Kerberos
            Authentication Package:    Kerberos
            Transited Services:    -
            Package Name (NTLM only):    -
            Key Length:        0
        This event is generated when a logon request fails. It is generated on the computer where access was attempted.
        The Subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.
        The Logon Type field indicates the kind of logon that was requested. The most common types are 2 (interactive) and 3 (network).
        The Process Information fields indicate which account and process on the system requested the logon.
        The Network Information fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.
        The authentication information fields provide detailed information about this specific logon request.
            - Transited services indicate which intermediate services have participated in this logon request.
            - Package name indicates which sub-protocol was used among the NTLM protocols.
            - Key length indicates the length of the generated session key. This will be 0 if no session key was requested.
    Which looks to me like a blank authentication request is being sent? (I've not deleted any machine/domain names, they're just not present)
    Any suggestions? Do you think I'm barking up the wrong tree?
    Thoughts and comments gratefully received

    Hi,
    What's your guest system platform? Based on my experience, this must be an unsupported guest system issue; a Generation 2 VM only supports the Windows 8 or 8.1 platform.
    The related KB:
    Generation 2 Virtual Machine Overview
    http://technet.microsoft.com/en-us/library/dn282285.aspx
    Hope this helps.

  • iMac 5,1 Backup and extension of hard drive strategy

    I have an iMac 5,1 500 GB, 20-in running Time Machine on a Seagate 1TB FreeAgent. Both have worked great since I got them. I recently decided I needed a more robust backup strategy and also needed to consider extending my hard drive space. After researching this site, CNET, Amazon and others, I narrowed it down to four including LaCie Rugged 1T, WD My Passport Studio 1T, G-Drive 1T and OWC Mercury Elite. I decided on buying two external drives. One is the 1.0TB (1000GB) OWC Mercury Elite Pro High Performance 7200RPM FireWire 400+USB2 Solution, and the other is 2.0TB (2000GB) OWC Mercury Elite Pro 'Quad Interface' 5900RPM 64MB Cache Solution with eSATA/FW800/FW400/USB2.
    My question is, what is the best way to make use of these three external drives to have a redundant backup and to extend my hard drive, which is now 40 GB from its limit? What I want to accomplish is:
    a TM backup
    a clone
    a second clone
    more usable hard drive space
    My thought was to do the following:
    Keep TM the way it is with the Seagate 1TB
    Partition the OWC 1T such that I can clone onto it and also have some usable space to extend my hard drive
    Partition the OWC 2T such that I can clone the iMac and clone the 1T partition that is hard drive extension, if that makes sense.
    Am I making this too complicated? Is there a better way? Is it OK to partition an external hard drive for both clone use and extra usable space? Can I clone from the 1T to the 2T as I suggested?
    Thanks for your suggestions. This is my first post that I hope doesn't get me thrown off the boards.

    I implemented the backup plan I outlined above but have experienced an issue I can't resolve. For some reason, the OWC Elite AL Pro 1T will not boot over the FW400 connection even though it appears in the Startup options. It does not appear as an option when I hold the Option key on startup. If I connect via USB, it works fine and boots as expected.
    So, I called OWC Support and they did not have a solution. I decided to return the OWC 1T and bought another 2.0TB (2000GB) OWC Mercury Elite Pro 'Quad Interface' 5900RPM 64MB Cache Solution with eSATA/FW800/FW400/USB2. The other one I bought cloned and booted just fine.
    So now I am experiencing the same issues with the new 2T drive that I experienced with the 1T. Connecting via USB works fine, but connecting via FW400 will not boot. The drive is recognized in Finder and otherwise works as expected when connected via FW.
    Anyone have any ideas as to solution?
    Thanks.

  • Purpose of Retention Policy Recovery Window and Redundancy

    Hi,
    Good Evening,
    I have some queries regarding the RMAN Retention Policy Recovery Window and Redundancy.
    1. Are there any conditions for setting the retention policy Recovery Window, Redundancy and control_file_record_keep_time? What is the relationship between these 3 parameters?
    2. Explain the scenario if I set control_file_record_keep_time=4, Redundancy=3 and Recovery Window=7.
    3. If I set Redundancy=3 and Recovery Window=7, does my backup location only keep 3 copies of the backup based on the redundancy? Then what is the purpose of Recovery Window=7? Please give an example.
    4. If I change the values to Recovery Window=3 and Redundancy=7, what will happen? How many days of backups will be available in my FRA location? Explain with a scenario.
    Thanks in advance.
    Vijay.

    Hi,
    Take a look at the doc contents above:
    Configuring the Backup Retention Policy
    As explained in "Backup Retention Policies", the backup retention policy specifies which backups must be retained to meet your data recovery requirements. This policy can be based on a recovery window or redundancy. Use the CONFIGURE RETENTION POLICY command to specify the retention policy.
    So you have the option to choose either a recovery-window-based or a redundancy-based configuration.
    This is what the doc says about both:
    Recovery Window-Based Retention Policy ==>RMAN does not consider any full or level 0 incremental backup as obsolete if it falls within the recovery window.  Additionally, RMAN retains all archived logs and level 1 incremental backups that are needed to recover to a random point within the window.
    Redundancy-Based Retention Policy==>The REDUNDANCY parameter of the CONFIGURE RETENTION POLICY command specifies how many full or level 0 backups of each datafile and control file that RMAN should keep. If the number of full or level 0 backups for a specific datafile or control file exceeds the REDUNDANCY setting, then RMAN considers the extra backups as obsolete. The default retention policy is REDUNDANCY 1.
    RMAN> show RETENTION POLICY;
    using target database control file instead of recovery catalog
    RMAN configuration parameters for database with db_unique_name DDTEST are:
    CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    RMAN> CONFIGURE RETENTION POLICY TO REDUNDANCY 3;
    old RMAN configuration parameters:
    CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    new RMAN configuration parameters:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 3;
    new RMAN configuration parameters are successfully stored
    RMAN> show RETENTION POLICY;
    RMAN configuration parameters for database with db_unique_name DDTEST are:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 3;
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    old RMAN configuration parameters:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 3;
    new RMAN configuration parameters:
    CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    new RMAN configuration parameters are successfully stored
    RMAN> show RETENTION POLICY;
    RMAN configuration parameters for database with db_unique_name DDTEST are:
    CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    CONTROL_FILE_RECORD_KEEP_TIME: This parameter applies only to records in the control file that are circularly reusable (such as archive log records and various backup records). Ref. doc: CONTROL_FILE_RECORD_KEEP_TIME
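    As a rough illustration (the value 14 below is an example, not a recommendation): control_file_record_keep_time is set at the database level, separately from the RMAN retention policy, and is usually kept at least as long as the recovery window so that backup records are not aged out of the control file prematurely:
    -- SQL*Plus sketch with an example value
    SHOW PARAMETER control_file_record_keep_time
    ALTER SYSTEM SET control_file_record_keep_time = 14 SCOPE = BOTH;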
    So I believe you can get the answers to your questions from the details above.
    HTH

  • Online Backup and Sharing issue

    I am having trouble with the restore part of my Online Backup and Sharing.  Whether I select specific files or a folder, the history shows that the backup was successful, but when I try to find it in the list of files that can be restored, the files that I want are not there.
    I can see the structure of folders and sub-folders, but the specific files are not anywhere in my restore list.  
    Also, there seem to be two different structures, with some of my folders in one and some in another, with some redundancy. Neither is complete.
    Is there a way that I can wipe what is in my online storage completely clean and start over?

    There is a specialized team who troubleshoots issues with the backup and sharing program.
    A dedicated internal phone number (866-770-6800) has been provided for hand-offs to the VOL Backup & Sharing Technician.
    Hours of Operation are Mon - Sat 7am - 11pm EST.
    After hours the customer will be asked to leave a message and someone from VOBS support will return their call during business hours.
    Anthony_VZ

  • Why do other browsers (IE, Chrome, Opera, Safari) list StartCom Class 2 Primary Intermediate Server CA as a Trusted Intermediate Certification Authority but Firefox doesn't?

    We are setting up registrations for a paid event and have bought an SSL certificate for our site. Everything works fine when the registration page is accessed through IE, Chrome, Opera or Safari (which list StartCom Class 2 Primary Intermediate Server CA as a Trusted Intermediate Certification Authority), but when I click on that link in Firefox I get the "This Connection is Untrusted" page because only StartCom Class 1 is listed as trusted.
    Why is that?

    It is always the responsibility of a website to send the complete certificate chain.
    You can check the certificate chain of breastfeedingconference.asn.au and see that the server doesn't send the intermediate certificate.
    * http://www.networking4all.com/en/support/tools/site+check/
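    If you prefer a command-line check, OpenSSL can show exactly which certificates the server actually sends (the hostname below is the one from this thread); if the StartCom Class 2 intermediate is not in the output, the server is not sending it:
    # Print every certificate presented during the TLS handshake
    openssl s_client -connect breastfeedingconference.asn.au:443 -servername breastfeedingconference.asn.au -showcerts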

  • Ideas for Providing User Level Data Backup and Restore

    I'm looking for ideas for implementing a user level application data backup and restore in an Apex app.
    What would be great is to have a user be provided an export file and a way to import this file. A bit overkill but hopefully never needed.
    Another option that is perfectly doable is a report that simply provides a means to create an export of the data. Since I already have an import interface, I could feed such an export back through it.
    Any thoughts?
    Hopefully I'm missing something already there for an end user to use.

    jlincoln wrote:
    "Do you mean "export" and "import" colloquially, or in the specific sense of the exp/imp/datapump utilities?"': I mean as in imp/exp Oracle utilities. Generally speaking, it would be neat to be able to export and import via an Apex an application. In this hosted environment I don't have that access but would this be a bad idea if you don't care about the existing data in the schema in which the data resides?I can envisage a mechanism using <tt>exp/imp</tt>, but since it requires <tt>dbms_scheduler</tt> external jobs and access to the file system it's highly unlikely to be possible in a hosted environment. (Unless you're doing the hosting?)
    Backup: Necessary for peace of mind and flexibility. I am working on a VB/Access user who does this today to get to the point where they can be comfortable with the backups occurring regularly and being handled by the hosting site's DBA group.
    Restore: Like I said, I am working on a VB/Access user who does this today to get to the point where they can be comfortable with the backups occurring regularly and being handled by the hosting site's DBA group. This is a very small data set. A restore would simply remove existing data and replace it with the new data.
    My opinion is that time would be better spent working on the user rather than on a redundant backup and restore feature. Involve them in a disaster recovery exercise with whoever is hosting the environment to prove that their data is safe. Normally the inclusion of data in regular, effective database backups is sold as a major feature of APEX solutions.
    "What about security/privacy when this data ends up in uncontrolled environments?": I don't understand the point of this question. The data should not end up in uncontrolled environments, just like the data in the database or its backups.
    Again, having data in a central, shared location protected by multiple levels of application, database, and OS security is usually seen as a plus for APEX over VB/Access. Exporting the data in toto to a PC/laptop that can be stolen or lost, and where it can be copied to USB drives/phones/email, loses this protection.
    User Level: Because the end user must have access to the backup and restore mechanisms of the application.
    Application Data: The application data. Less than 10MB. Very small. It can be exported in a flat file downloaded by the end user. This file can then be used to upload and import via an existing application interface. For example.
    "I'm struggling to parse this for meaning.": When I say I have an existing interface I am referring to a program residing in the Apex application that will take data from a flat table structure (i.e. interface table), validate the data, derive data, and load into the target table structure.Other than the report export capability linked to above, there's nothing built-in to APEX that comes close to your requirement. If the data is simple enough that it can be handled in such a report, and you have a process that can read and recreate this export, then you have your backup/restore capability. If the data can't be handled in a simple report, then you'll need a more complex PL/SQL process to generate the file.

  • Move Certification Authority Web Enrollment to new server issue.

    Hello, 
    I'm trying to move the Certification Authority Web Enrollment from one server to a new one. I've got a fully functional server where I can enroll any certificate I want and everything is working properly.
    On the new server I configured, I'm facing a problem that seems to be an impersonation issue. Indeed, when I try to enroll a certificate I get the following error message from the interface:
    Request Mode:
    newreq - New Request 
    Disposition:
    (never set) 
    Disposition message:
    (none) 
    Result:
    The RPC server is unavailable. 0x800706ba (WIN32: 1722) 
    COM Error Info:
    CCertRequest::Submit: The RPC server is unavailable. 0x800706ba (WIN32: 1722) 
    LastStatus:
    The operation completed successfully. 0x0 (WIN32: 0) 
    Suggested Cause:
    This error can occur if the Certification Authority Service has not been started. 
    and I can also see the following application error event on the CA it targets:
    Event 18209, ComRuntime:
    The application-specific permission settings do not grant Local access permission to the COM Server application C:\Windows\system32\certsrv.exe with APPID 
    {D99E6E74-FC88-11D0-B498-00A0C90312F3}
     to the user NT AUTHORITY\ANONYMOUS LOGON SID (S-1-5-7) from address LocalHost (Using LRPC). This security permission can be modified using the Component Services administrative tool.
    While I register a certificate on the server where it all works fine, I can see events in the Security log on the CA that authenticate the user I generate the certificate with, whereas on the server where it does not work, everything seems to be anonymous.
    The IIS configuration is identical on both servers and the delegation has been set identically too (ADUC object).
    Any idea what I could check next?

    Hi,
    Regarding event 18209, please follow the steps in the article below to assign access permissions to the user mentioned in the event message:
    Event ID 18209 — COM Security Policy Configuration
    http://technet.microsoft.com/en-us/library/cc726319(v=WS.10).aspx
    Best Regards,
    Amy

  • Upgrading PowerShell 2.0 to 3.0 on a Windows Server 2008 SP 2 Enterprise Certification Authority server

    Hello All:
    Are there any caveats to upgrading PowerShell 2.0 to 3.0 on a customer's Certification Authority server? The customer will also be upgrading to SCCM 2012 and will employ this server as a Distribution Point.
    Any feedback would be greatly appreciated.
    Thank you.

    Hi Erik,
    I haven't tried upgrading PowerShell on a Certification Authority server; however, Windows Management Framework 3.0 requires Microsoft .NET Framework 4.0, so you will need to update the .NET version on Server 2008 SP2.
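    A quick pre-check before installing WMF 3.0 might look like this (a sketch; the registry path is the standard location of the .NET 4.x Full profile):
    # Current PowerShell version (WMF 3.0 requires .NET Framework 4.0, as noted above)
    $PSVersionTable.PSVersion
    # Is the .NET Framework 4.x Full profile installed?
    Test-Path 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full'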
    For more detailed installation instruction, please follow this article:
    Windows Management Framework 3.0
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang

  • 2 ISP load balancing and redundancy

    Hello!!
    Our small company has about 40 branches spread across the city. The branches are connected by optic fibre supplied by our ISP, so at the ISP our branches sit in one VLAN. From every branch we created a VPN tunnel to our server room in the central office. The central office acts as the central point: if the optic line to the central office fails, there are no VPN tunnels and no network for any of the branches. Moreover, all the traffic goes through the central office.
    Now we have decided to lay one more optic line to our central office, which will increase bandwidth and redundancy.
    Private network topology: there are no default gateways and IP addresses. For example, at the first branch I can plug a computer directly into the media converter and at the second branch plug another computer into the media converter. After that, the two computers are in one network and I can assign any IP addresses to them.
    What I have: our firewall does enough work already and I don't want to overload it, but we have some free ports in our new Cisco 3750. The question is how to do load balancing and redundancy. Can it do load balancing according to traffic? And how do we load balance incoming traffic? For example, when a connection is established from a branch's router, how will that router choose which line to make the connection through? By the way, at all branches we use noisy Cisco 3700 series routers.

    Sorry for bumping a 1-year-old thread.
    We talked to our network provider. They said "these two cables come from two different places, so there is no way to use EtherChannel. You must use an active-standby solution."
    Relying on STP, we just plugged the two cables into the 3750 stack. But with the default STP settings the connection was very unstable, with many packet losses and disconnections. So we found an easy solution with "flex links", making one interface the backup of the other. Only now have I realized that this is not a full failover solution: if the network beyond the media converter goes down, the link from the media converter to the switch stays up.
    What could I do to make our L2 WAN redundant? Are there any additional STP settings?
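    For reference, the "flex links" arrangement described above comes down to a single command on the primary uplink (interface numbers are examples only):
    ! Flex links sketch: Gi1/0/2 stays in standby until Gi1/0/1 loses link.
    ! As noted above, only local link loss is detected; a failure beyond the
    ! media converter does not trigger the switchover.
    interface GigabitEthernet1/0/1
     switchport backup interface GigabitEthernet1/0/2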

  • Usefulness of Certification Authority Web Enrollment?

    If a deployment has Certificate Enrollment Web Service and Certificate Enrollment Policy Web Service installed, is there still a need for Certification Authority Web Enrollment? This Windows Server 2012 CA design has an offline root CA, two Enterprise Subordinate CAs in a cluster, and two web servers hosting AIA/CDP/OCSP/CES and CEP behind a load balancer. There is
    also a standalone NDES server.
    Thanks

    Starting with Windows Server 2008, web enrollment became largely useless, as it allows only user certificates; therefore you should avoid installing web enrollment whenever possible. As for CEP/CES, there is a dependency: only Windows 7 and later support it.

  • What is the certification authority, the third party that can confirm the digital signature?

    I created a nice electronic signature that I now regularly use and add to every document. I was told that a signature needs to be issued by a certification authority, a third party that is able to verify the signature/certificate. I created a free certificate at CAcert.org and tried to combine it with the Adobe signature certificate file, but it doesn't support .cer and .crt files. Is Adobe the certification authority in this case, since I created the signature in the Adobe software? It's not a big deal, I just want everything to be correct since I now use the signature in official documents (instead of scanning a signed document)... Thanks for any info, ideas or help.
    Jacob

    Each Digital Certificate has a pair of private and public keys used for encryption/decryption. The private key belongs to the certificate owner and should be kept secret. It is protected by a password. The public key can be used by anyone. Digital certificates come in two flavors: one that contains both private and public key and one that contains only public key.
    When you create a digital signature, the signing process uses the private key to encrypt the signed content digest, and the public key is used to decrypt it. So, only you can encrypt signed content with your certificate that has both private and public keys, and anyone can decrypt it to validate the signature using the certificate that has only the public key. Usually, this certificate with the public key only is embedded in the digital signature, so that anyone can use it for decryption.
    The .cer certificate contains only public key. Certificates with both private and public keys usually have extensions .pfx or .p12. You need one of those to sign.
    CAcert.org issues only public key certificates, so you cannot use its certificates for digital signing.
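    If you do end up with a certificate and its private key as separate PEM files (for example from a commercial CA), they can be combined into a .p12 that signing software can import; a generic OpenSSL sketch with hypothetical file names:
    # Package a private key and its certificate (plus optional CA chain) into a .p12;
    # you will be prompted for an export password that protects the private key.
    openssl pkcs12 -export -inkey my_key.pem -in my_cert.pem -certfile ca_chain.pem -out my_signing_id.p12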
    Adobe is not a general purpose certification authority. It issues some certificates for internal use only.
    Acrobat has a feature that allows you to create so-called self-signed certificates with both private and public keys but these certificates can be used only in a limited way. They do not provide the means to authenticate the real certificate owner nor revoke a certificate if it is stolen.
    Generally, a digital signature asserts three main features:
    1. Document integrity (the document has not been changed since it was signed),
    2. Authentication (the signer is indeed who the certificate says they are),
    3. Non-repudiation (the signature author cannot deny that he signed it: this is achieved via certificate revocation mechanism).
    A self-signed certificate (of the type that Acrobat produces) can be used only for #1. It cannot be used for #2 and #3. The latter two come only when a certificate (with private key) is issued by a reputable Certificate Authority which is trusted (like VeriSign, Symantec, etc.).

  • Firefox does not recognize SSL Certificate issuer Entrust Certification Authority – L1K, but Entrust Certification Authority – L1C is ok?

    We have a new Entrust SSL Certificate with issuer Entrust Certification Authority – L1K which Firefox does not recognize. Internet Explorer and Chrome are ok.
    On a different system we have an Entrust SSL Certificate with issuer Entrust Certification Authority – L1C which is ok with Firefox.

    Did you verify that all intermediate certificates are installed on the server?
    You can inspect the certificate chain via a site like this:
    *http://www.networking4all.com/en/support/tools/site+check/
    *https://www.ssllabs.com/ssltest/
