Downstream Capture using archivelog files

DB Version 11.2
I have not been able to find a demo/tutorial of using Streams with Downstream Capture where only the archivelog files from the source DB are available (they were moved to a remote system on DVD/CD/tape). All the examples I have been able to find use a network connection.
Does anyone know of such an example?
Thank you.

Hi!
Please, can you elaborate on your question more clearly?
Explanation of Downstream Capture...
I am assuming we have two databases: a production database and a DR (disaster recovery) database.
We want changes from the production database to be replicated to the DR database.
For performance reasons, we want no Streams processes running on production.
To achieve that we use downstream capture, in which both the capture and apply processes run on the DR database. The problem then is how that capture process will capture changes from the production database.
So we configure redo transport (as in a Data Guard configuration) between these two databases. The production database must be in archivelog mode, whereas the DR database can be in noarchivelog mode. The archived log files from the production database are then copied to the DR database automatically over a network connection, and from these archived logs the capture process captures the changes.
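For example (a minimal sketch only; the service, database, path and capture names below are placeholders, not taken from any document), the network shipping described above is usually set up with a LOG_ARCHIVE_DEST_n entry on production, while archived logs that were copied by hand can instead be registered for the capture process at the downstream database:

-- On production (source): ship redo to the downstream/DR database
log_archive_dest_2='SERVICE=DRDB.EXAMPLE.COM ASYNC NOREGISTER
   VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE)
   DB_UNIQUE_NAME=drdb'
log_archive_dest_state_2=ENABLE

-- On the downstream (DR) database: register an archived log that was copied
-- manually (e.g. restored from DVD/tape) for a capture process named strm_capture
ALTER DATABASE REGISTER LOGICAL LOGFILE
  '/u01/transferred_logs/prod_arch_1_123_456.arc' FOR 'strm_capture';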
Now, why do you want the archived logs to be copied on DVDs?
regards
Usman

Similar Messages

  • Downstream capture

    Hi all!
    We have a non-RAC production database and I've successfully duplicated it to an ASM RAC using RMAN duplicate. Now my plan is to make the RAC instance our report server and to configure it for advanced replication using downstream capture from our production server.
    Duplicating my prod server takes time (more than 12 hrs). How will I make my RAC instance as fresh as my prod? The 12 hrs of restoration creates a gap between my prod and my report server. Will the SCN specified when setting up downstream capture solve it? As I understand it, downstream capture uses archivelogs. Or do I need to roll forward my RAC instance using an incremental backup? I need my report server as fresh as my prod because users generate reports every day.
    Please guide me.
    Thanks a lot!

    Hi,
    GoldenGate is a good solution for replication; I have used it. It's more straightforward than Streams.
    You need to dive in and explore, since you won't find much practical documentation on it. This installation guide might help you get started: http://download.oracle.com/docs/cd/E18101_01/doc.1111/e17799.pdf
    Regards,
    Mansoor

  • Downstream Capturing - Two Sources

    Hi
    We are planning to have two source databases and one consolidated/merge database. We are planning to use Streams. I have just configured downstream real-time capture from one source database.
    Since I can have only one real-time downstream capture process, the second source will have to use an archived-log downstream capture process.
    How do I configure this archived-log downstream capture process? Where in the code is the difference between real-time and archived-log capture?
    Thanks
    Sree

    You will find the steps for configuring downstream capture in the 11.2 Streams Replication Administrator's Guide towards the end of chapter 1. Here is the URL to the online doc section that gives examples of the source init.ora parameter log_archive_dest_2 for real-time mining and archived-log mining:
    http://docs.oracle.com/cd/E11882_01/server.112/e10705/prep_rep.htm#i1014093
    In addition to the different log_archive_dest_n parameter settings between real-time and archived-log mining, real-time mining requires standby logfiles at the downstream mining database. The instructions for that are also in the Streams Replication Administrator's Guide.
    Finally, the capture parameter DOWNSTREAM_REAL_TIME_MINE must be set to Y for real-time mining. The default is N for archived-log mining.
    Chapter 2 in that same manual covers how to configure downstream capture using the MAINTAIN_SCHEMAS procedure.
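    As a rough illustration (service, path, group and capture names below are placeholders, not taken from the manual), the pieces that differ between the two modes look something like this:

    -- Source init.ora: ship redo to the downstream database dbs2
    -- (for archived-log mining a TEMPLATE clause typically names the shipped files)
    log_archive_dest_2='SERVICE=DBS2.EXAMPLE.COM ASYNC NOREGISTER
       VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE)
       TEMPLATE=/u01/log_for_dbs1/dbs1_arch_%t_%s_%r.log
       DB_UNIQUE_NAME=dbs2'

    -- Downstream database: standby redo logs are needed only for real-time mining
    ALTER DATABASE ADD STANDBY LOGFILE GROUP 4
      ('/u01/oradata/dbs2/slog4.rdo') SIZE 50M;

    -- Downstream database: Y = real-time mining, N (the default) = archived-log mining
    BEGIN
      DBMS_CAPTURE_ADM.SET_PARAMETER(
        capture_name => 'strm_capture',
        parameter    => 'downstream_real_time_mine',
        value        => 'Y');
    END;
    /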

  • Using archivelog and control file from other Oracle server

    I am still bothered with my backup process.
    I have 2 AIX boxes (same model, say A and B); both have BAAN 5 and Oracle 10g R2 on them. Right now my colleague insists on using a Data Pump export (as a "cold backup") from the prod Oracle server (A) to restore the Oracle server on Box B. The prod server has archivelog mode turned on. But that will miss any transactions made after the export, up to the crash point of Box A. So this is my confusion.
    Can I pass the control files and archivelog files from Box A (prod server) to Box B and use them to restore Box B as the latest prod server? How?
    I tried to convince them to use RMAN backups, but was not successful.
    I think the best way is probably to use Oracle Data Guard. However, my manager and colleague have one concern: that such a process will leave the data on the restored server (the failover, Box B) unrecognizable by BAAN, which defines the objects (tables).
    Thanks

    A logical backup (export) is not useful for restoring to the point of failure. The only valid option available is a hot backup in archivelog mode. Your Recovery Manager backup includes a control file and archived redo log backup, so those can be restored at the destination. You must take care of the way you perform the backup, and ensure the paths where your backup is deposited are visible to the second node. Shared storage with the same mount points is suitable in this case. A tape robot configured at both nodes is also a suitable solution.
    Recovery Manager performs a control file and spfile restore, too. These RMAN commands perform the action:
    SET DBID <DBID of the database for which you want to restore the controlfile>;
    RESTORE CONTROLFILE FROM '<name_of_backuppiece_which_contains_the_controlfile_backup>';
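    For illustration only (the DBID, paths and backup piece name below are made up, and an spfile/pfile is assumed to already be in place on Box B), a full restore on Box B might run along these lines:

    RMAN> SET DBID 1234567890;
    RMAN> STARTUP NOMOUNT;
    RMAN> RESTORE CONTROLFILE FROM '/backup/c-1234567890-20080101-00';
    RMAN> ALTER DATABASE MOUNT;
    RMAN> RESTORE DATABASE;
    RMAN> RECOVER DATABASE;
    RMAN> ALTER DATABASE OPEN RESETLOGS;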
    I don't see any problem on the Recovery Manager side, and technically speaking, on the Oracle side it is perfectly possible to restore your database at a remote location. I don't know what happens on the BAAN side, or whether you are required to have it configured to be operative on the target node. You could try to clone your database on node B, configure BAAN, and prepare the procedure for use in case of failure.
    Configuring Data Guard is also a recommended option, as is considering Cold Failover Cluster (CFC) configurations. I have recently performed a CFC configuration with BAAN. No problems; it works smoothly.
    ~ Madrid.

  • Configure log file transfer to downstream capture daabase!

    Dear all,
    I am setting up bidirectional replication between 2 database servers that are Linux based, with Oracle 11gR1 as the database.
    I am following the Oracle Streams Administrator's Guide and have completed all the pre-configuration tasks, but I am confused by the step where we have to configure log file transfer to the downstream capture database.
    I am unable to understand this from the documentation.
    I mean, how do I configure Oracle Net so that the databases can communicate with each other in bidirectional replication?
    Configure authentication at both databases to support the transfer of redo data? How can I do this?
    The third thing is the parameter setting, which obviously I can do.
    Kindly help me through this step.
    Regards, Imran

    And what about this:
    Configure authentication at both databases to support the transfer of redo data?
    Thanks, Imran
    For communication between the two databases, you create a Streams administrator at both databases. The strmadmin accounts talk to each other.
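    As a rough sketch (user name, password, tablespace and the net service alias below are examples only, not from the guide), the Streams administrators and the database link between them could be created like this:

    -- Run on each database as SYSDBA
    CREATE USER strmadmin IDENTIFIED BY strmadmin_pw
      DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
    GRANT DBA TO strmadmin;
    BEGIN
      DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(grantee => 'strmadmin');
    END;
    /

    -- Connected as strmadmin, pointing at the other database's Oracle Net alias
    CREATE DATABASE LINK db2.example.com CONNECT TO strmadmin
      IDENTIFIED BY strmadmin_pw USING 'DB2';

    For the redo transport authentication itself, the usual approach is a password file at both databases with the same SYS password (REMOTE_LOGIN_PASSWORDFILE=EXCLUSIVE).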
    Regards,
    S.K.

  • Using RMAN to delete unwanted archivelog files..??

    Hi All,
    How can I remove unwanted archivelog files from the disk to manage disk space usage, using RMAN?
    My 10g database is in ARCHIVELOG mode and the OS is RHEL ES Release 3.
    When I tried 'delete expired archivelog all;', I got the following result:
    RMAN> delete expired archivelog all;
    released channel: ORA_DISK_1
    allocated channel: ORA_DISK_1
    channel ORA_DISK_1: sid=363 devtype=DISK
    specification does not match any archive log in the recovery catalog
    Please update...
    Many Thanks in advance.....!!!

    "With the delete all input, can I assume that the archivelog files have been deleted?"
    Don't you trust Oracle? :-)
    You should see two series of messages, like the ones below:
    channel ORA_DISK_1: starting archive log backupset
    channel ORA_DISK_1: specifying archive log(s) in backup set
    input archive log thread=1 sequence=73 recid=1 stamp=599697994
    input archive log thread=1 sequence=74 recid=2 stamp=599698064
    input archive log thread=1 sequence=75 recid=3 stamp=599698103
    input archive log thread=1 sequence=76 recid=4 stamp=599698138
    input archive log thread=1 sequence=116 recid=44 stamp=600271740
    input archive log thread=1 sequence=117 recid=45 stamp=600271859
    input archive log thread=1 sequence=118 recid=46 stamp=600277637
    channel ORA_DISK_1: starting piece 1 at 04-SEP-06
    channel ORA_DISK_1: finished piece 1 at 04-SEP-06
    piece handle=/home/ora102/flash_recovery_area/DB102/backupset/2006_09_04/o1_mf_annnn_TAG20060904T154722_2hrcmh13_.bkp tag=TAG20060904T154722 comment=NONE
    channel ORA_DISK_1: backup set complete, elapsed time: 00:00:37
    channel ORA_DISK_1: deleting archive log(s)
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_08_28/o1_mf_1_73_2h6ok83j_.arc recid=1 stamp=599697994
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_08_28/o1_mf_1_74_2h6omjb9_.arc recid=2 stamp=599698064
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_08_28/o1_mf_1_75_2h6onq39_.arc recid=3 stamp=599698103
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_08_28/o1_mf_1_76_2h6oosvn_.arc recid=4 stamp=599698138
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_09_04/o1_mf_1_116_2hr5tw8c_.arc recid=44 stamp=600271740
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_09_04/o1_mf_1_117_2hr5ym80_.arc recid=45 stamp=600271859
    archive log filename=/home/ora102/flash_recovery_area/DB102/archivelog/2006_09_04/o1_mf_1_118_2hrcm58z_.arc recid=46 stamp=600277637
    Finished backup at 04-SEP-06
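    If the goal is simply to purge archived logs that are no longer needed (not only expired ones), commands along these lines are commonly used; the seven-day window here is just an example, so adjust it to your own backup and recovery policy:

    RMAN> CROSSCHECK ARCHIVELOG ALL;
    RMAN> DELETE NOPROMPT EXPIRED ARCHIVELOG ALL;
    RMAN> DELETE NOPROMPT ARCHIVELOG ALL COMPLETED BEFORE 'SYSDATE-7';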

  • Downstream Archivelog files

    I am setting up Streams with downstream capture. I have archived logs being shipped to the ASM filestore on the downstream database (through log_archive_dest_2 set on the primary). These downstream logs are mounting up and I can't seem to find a way of listing/deleting them through RMAN (either on the primary or the downstream database). Oracle Support have told me that you have to manage them manually. It's a bit cheeky disbelieving them and posting the same question on the forum, but I thought there must be some way of automating this.
    Cheers
    Simon

    Simon, have you solved this issue?
    We are seeing the same behavior and have an SR open with Oracle.
    As per Note 421176.1, RMAN should in fact delete the archived logs on the source side.
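    One thing worth checking in the meantime (a sketch only; whether the PURGEABLE column is populated depends on your exact release) is which registered logs the downstream capture process still needs, since the files marked purgeable should be the ones that are safe to remove from ASM by hand or by script:

    SELECT consumer_name, name, purgeable
    FROM   dba_registered_archived_log
    ORDER  BY first_scn;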

  • Problems with .dv file captured using Capture Magic HD from Panasonic Cam

    I am trying to capture video from my Panasonic AG-HVX200P Camera using a program called Capture Magic HD, and I want to capture using the Raw DV or M2T Stream file type setting. There is another option of Write DV as Quicktime setting, but takes forever to write. When I get the .dv footage, it has like a rainbow color look to it. Does anybody think that it's the program Capture Magic HD having problems, or my mac?

    My response in these discussions is always, "record to tape." You can solve any issue in post if you have the tape. If you don't have the tape, you're stuck with whatever you decided to try to capture. In the many dozens of "capture live" threads we've entertained here over the years, I can only recall a few where the user was actually satisfied they captured to the drives instead of to tape. All the others ended in terrible, totally avoidable disasters.
    But I digress.
    You're saying this application digitizes and writes a separate .m2t file for each of the video source inputs?
    I had never heard of .m2t, so I looked it up. The most interesting thing is that it seems to be a variant of regular ol' HDV disguised as MPEG2. You might try simply changing the file extension or the Open With setting.
    from the cow:
    Okay, believe it or not, the solution is as simple as this - rename the .m2t to .mpg. Ya, that's it. Now try to load it and After Effects has no trouble recognizing it. I'm going to delve into Vegas to see if it can change the extension on files it captures to save a few keystrokes, but at least I'm back to editing rather than agonizing over having to render my files to another format before taking them to AE.
    Re: Sony m2t file workaround
    by Dave LaRonde on Jun 9, 2008 at 2:33:12 pm
    [Craig Stewart] "the solution is as simple as this - rename the .m2t to .mpg "
    That's not a really terrific solution if you intend to use the footage in AE. Read on:
    Dave's Stock Answer #1:
    If the footage you imported into AE is any kind of the following -- Native HDV, MPEG1, MPEG2, mp4, m2t, H.261 or H.264 -- you need to convert it to a different codec.
    These kinds of footage use temporal, or interframe compression. They have keyframes at regular intervals, containing complete frame information. However, the frames in between do NOT have complete information. Interframe codecs toss out duplicated information.
    In order to maintain peak rendering efficiency, AE needs complete information for each and every frame. But because these kinds of footage contain only partial information, AE freaks out, resulting in a wide variety of problems.
    So..... just to make sure if I've got this straight:
    You're proposing to change the suffix from .m2t to .mpg..... which is MPEG, yes? A temporally-compressed codec, yes?
    It sounds to me like you'd just be swapping one problem for another.
    Dave LaRonde
    Sr. Promotion Producer
    KCRG-TV (ABC) Cedar Rapids, IA
    After changing the .m2t extension on sample video files to .mpg, I was also able to view them on RealOne Player version 2.
    It's an MPEG file for high-definition television. If you have a JVC HD camcorder, files downloaded to a PC are converted to this format.
    File Description: High-definition video recording format used by many HD camcorders; commonly referred to as "HDV"; uses MPEG-2 compression to store HD video data on DV or MiniDV tapes; supports resolutions of 720p and 1080i.

  • Source DB on RAC, Archived Log Downstream Capture:Logs could not be shipped

    I don't have much experience in Oracle RAC.
    We are implementing Oracle Streams using Archived-Log Downstream capture. Source and Target DBs are 11gR2.
    The source DB is in RAC (uses scan listeners).
    To prevent users from accessing the source DB, the DBA of the source DB shut down the listener on port 1521 (changed the port number to 0000 in some file). There was one more listener, on port 1523, that was up and running. We used port 1523 to create the DB link between the 2 databases.
    But because the listener on port 1521 was down, the archived logs from the source DB could not be shipped to the shared drive. As per the source DB DBA, the two instances in the RAC use this listener/port to communicate with each other.
    As such, when we ran the DBMS_CAPTURE_ADM.CREATE_CAPTURE procedure from the target DB, the LogMiner data dictionary that was extracted from the source DB to the redo logs was not available to the target DB, and the Streams implementation failed.
    It seems that for the archived logs to be shipped from the source DB to the shared drive, we need the listener on port 1521 up and running. (Correct me if I am wrong.)
    My question is:
    Is there a way to shut down a listener to prevent users from accessing the DB and have another listener up so that the archived logs can be shipped to the shared drive? If so, can you please give the details/an example?
    We asked the same question to the DBA of the source DB and we were told that it could not be done.
    Thanks in advance.

    Make sure that the dblink "using" clause is referencing a service name that uses a listener that is up and operational. There is no requirement that the listener be on port 1521 for Streams or for shipping logs.
    Chapter 4 of the 2Day+ Data Replication and Integration manual has instructions for configuring downstream capture in Tutorial: Configuring Two-Database Replication with a Downstream Capture Process
    http://docs.oracle.com/cd/E11882_01/server.112/e17516/tdpii_repcont.htm#BABIJCDG
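    As an illustration (host, port, service, user and destination names below are placeholders), the database link and the redo transport destination just need to reference a service registered with a listener that is actually up, for example the one on port 1523:

    -- tnsnames.ora entry on the downstream side, pointing at the surviving listener
    SRCDB_1523 =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = src-scan.example.com)(PORT = 1523))
        (CONNECT_DATA = (SERVICE_NAME = srcdb.example.com)))

    -- Database link used by the downstream capture configuration
    CREATE DATABASE LINK srcdb.example.com
      CONNECT TO strmadmin IDENTIFIED BY strmadmin_pw USING 'SRCDB_1523';

    -- Source-side destination shipping redo to the downstream database,
    -- again via a service that a running listener knows about
    log_archive_dest_2='SERVICE=DSTDB_1523 ASYNC NOREGISTER
       VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dstdb'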

  • How to edit a captured custom .wim file?

    Hi folks,
    I am in process of gaining experience with capturing and deploying Windows 7 SP1 from our WDS server using unattend file.
    Everything is working, since we installed our applications, ran sysprep, and captured a custom .wim file.
    Now we deploy it using unattend file and everything works.
    I am trying to figure out if there is a best practice here for adding applications to this new custom .wim file?
    I've tried deploying this new image to a unit, installing an app, and running sysprep again, but it fails with this error:
    A fatal error occurred while trying to sysprep the machine.
    I am thinking this has to do with multiple syspreps or something?
    I am trying to avoid having to use a default install.wim and app-install automation with MDT or something like that. We are strictly WDS-only right now.
    Appreciate any comments,
    Thanks!  romatlo

    Offline servicing allows for adding/removing drivers and updates in MSU/CAB files.
    I do not use MDT either. If you want to "install" a new application to an image that is "out of arms", then you will need to install it after deployment. If your image has an answer file, you can mount the image and copy the installation files to it, or you can keep them on a network share. Then, in the answer file, you can put in the commands to install it using FirstLogonCommands.
    I typically like to reference a .cmd file in my unattend that I can add/remove commands to finalized images. So if a client wants to install a new program but doesn't want to recreate the image, then I can have it installed that way.
    <FirstLogonCommands>
      <SynchronousCommand wcm:action="add">
        <Order>1</Order>
        <Description>Set up Service Account</Description>
        <CommandLine>cmd /C start /wait c:\folder\FirstLog.cmd</CommandLine>
      </SynchronousCommand>
    </FirstLogonCommands>

  • Flash recovery area: archivelog file directory modification time

    Using Oracle Database 10g Release 10.2.0.3.0 - 64bit (Standard Edition) on Solaris 10, SunOS 5.10 Generic_118833-33, archivelog files in the Flash Recovery Area are still present after the time at which they should have become obsolete and therefore been deleted:
    bvsmdb01-oracle10g $ date
    Thu Jan 24 16:04:46 GMT 2008
    bvsmdb01-oracle10g $ ls -lt archivelog
    total 20
    drwxr-x--- 2 oracle10g oinstall 1024 Jan 24 16:00 2008_01_24
    drwxr-x--- 2 oracle10g oinstall 512 Jan 23 23:30 2008_01_19
    drwxr-x--- 2 oracle10g oinstall 1536 Jan 23 23:04 2008_01_23
    drwxr-x--- 2 oracle10g oinstall 1536 Jan 22 23:30 2008_01_18
    drwxr-x--- 2 oracle10g oinstall 1536 Jan 22 22:53 2008_01_22
    drwxr-x--- 2 oracle10g oinstall 1024 Jan 21 23:07 2008_01_21
    drwxr-x--- 2 oracle10g oinstall 512 Jan 20 22:20 2008_01_20
    bvsmdb01-oracle10g $
    The archivelog directory for 2008_01_19 has a modification time of Jan 23 23:30 - this is almost 4 days after it was last written to.
    The current redundancy setting in RMAN is shown in the output of show all:
    RMAN> show all;
    using target database control file instead of recovery catalog
    RMAN configuration parameters are:
    CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 3 DAYS;
    CONFIGURE BACKUP OPTIMIZATION OFF; # default
    CONFIGURE DEFAULT DEVICE TYPE TO DISK; # default
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '%F'; # default
    CONFIGURE DEVICE TYPE DISK PARALLELISM 1 BACKUP TYPE TO BACKUPSET; # default
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE MAXSETSIZE TO UNLIMITED; # default
    CONFIGURE ENCRYPTION FOR DATABASE OFF; # default
    CONFIGURE ENCRYPTION ALGORITHM 'AES128'; # default
    CONFIGURE ARCHIVELOG DELETION POLICY TO NONE; # default
    CONFIGURE SNAPSHOT CONTROLFILE NAME TO '/u01/app/oracle/product/10.2.0/dbs/snapcf_kierli.f'; # default
    So, the current retention policy is three days. It is 24/1 today - the archivelog directory from 19/1 should have been deleted by now, but it has not been.
    Again, the archivelog directory 2008_01_19 shows a modification time of Jan 23 23:30 2008.
    Why is this? How can this be investigated? Does anyone have any suggestions?
    Thanks
    Jerry Mander

    From 2 Day DBA:
    Even after files in the flash recovery area are obsolete, they are generally not deleted from the flash recovery area until space is needed to store new files. As long as space permits, files recently moved to tape will remain on disk as well, so that they will not have to be retrieved from tape in the event of a recovery.
    What is the current space used in the FRA and what is the FRA disk limit ?
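    For reference, the FRA limit and current usage can be checked with queries along these lines:

    SELECT name, space_limit, space_used, space_reclaimable, number_of_files
    FROM   v$recovery_file_dest;

    SELECT file_type, percent_space_used, percent_space_reclaimable, number_of_files
    FROM   v$flash_recovery_area_usage;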

  • How to capture the excel file name

    Hi friends,
    When I execute my program, one Excel sheet is generated; for this I have used the FM 'SAP_CONVERT_TO_XLS_FORMAT'. I want to capture that Excel file name and pass it to the next function module as input, because I should not do any hard coding like passing the filename directly into the function module.
    Does anybody have any suggestions for this?

    Hi,
    Here is a piece of code...
    I hope this will be of some help to you.
    FORM convert_xls_itab.
      break devuser.
      CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
        EXPORTING
          i_field_seperator    = 'X'
          i_line_header        = 'X'
          i_tab_raw_data       = wa_tab
          i_filename           = p_file
        TABLES
          i_tab_converted_data = it_data
        EXCEPTIONS
          conversion_failed    = 1
          OTHERS               = 2.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
        EXPORTING
          filename    = p_file
          i_begin_col = p_begcol
          i_begin_row = p_begrow
          i_end_col   = p_endcol
          i_end_row   = p_endrow
        TABLES
          intern      = it_intern.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
    **--- Perform to move the data into an internal table
    ENDFORM. " CONVERT_XLS_ITAB
    Thanks & Regards
    Ashu Singh

  • How to capture using Pinnacle 700-USB Analog to Digital (USB)?

    I have a Pinnacle 700-USB Analog to Digital converter which inputs Composite and L+R Audio and outputs a Digital signal through USB.
    I'm using this to digitize VHS tapes.
    I am trying out different Video Editing sw including Adobe Elements 7. Others like Pinnacle Studio 12 and Corel Video Studio 12 can import from this USB-device. It doesn't seem to be possible using Adobe Elements 7.
    I've looked at the page http://help.adobe.com/en_US/PremiereElements/7.0/WS51F6C811-8B79-4c26-B4B9-24C0919182B6.html
    It mentions at the end "Note: If you capture using an AV DV converter, you might need to capture without using device control."
    1) What does the last Note mean? Which device control?
    2) Is there any possibility of this working? What should I try playing with?

    My Pinnacle Dv500 came bundled with Premiere 6.0 which I updated to 6.02... and I also updated the Dv500 driver software from the V3 to V4.5a so it would save in "standard" DV AVI Type 2 48khz files, instead of the ones that required the Dv500 codec to edit
    When I bought a new computer and started using WinXp instead of the Win2000 that came with my Alienware Pentium3, I could never get Premiere 6 to work... would start, flash a screen or two, and then just go away
    I then bought Scenalyzer, since it was compatible with the Dv500 (still had to have P6 and Dv500 drivers installed) but it turned out to be very FRAGILE as any slightest glitch on the tape would stop the program... and since some of what I'm doing is capturing my OLD library of VHS tapes, that simply wouldn't do
    My current solution is a dual boot drive with Win2k/WinXp
    Premiere 6 captures EVERYTHING in the Win2k partition... files saved to 2nd drive... and then I use PProCS3/Encore3 for editing and DVD creation
    Just a "bit" of a hassle to have to restart between capture & edit/dvd... but much better than Scenalyzer stopping every time an old movie hit a rough spot in the tape... rough spots that capture just fine with P6

  • I have my photos in iPhoto but my iPhoto library is empty. When clicked, the message reads: could not be opened; image capture cannot open files in the photo library format. How do I get my photos back into my library?

    I have my photos in iPhoto but my iPhoto library is empty. When clicked, the message reads: "could not be opened; Image Capture cannot open files in the photo library format." How do I get my photos back into my library?

    When what is clicked?
    There are 9 different versions of iPhoto and they run on 9 different versions of the Operating System. The tricks and tips for dealing with issues vary depending on the version of iPhoto and the version of the OS. So to get help you need to give as much information as you can. Include things like:
    - What version of iPhoto.
    - What version of the Operating System.
    - Details. As full a description of the problem as you can. For example, if you have a problem with exporting, then explain by describing how you are trying to export, and so on.
    - History: Is this going on long? Has anything been installed or deleted? - Are there error messages?
    - What steps have you tried already to solve the issue.
    - Anything unusual about your set up? Or how you use iPhoto?
    Anything else you can think of that might help someone understand the problem you have.

  • Image Capture creating zip files instead of regular files

    I've been using Image Capture for many years. For some reason, all of a sudden, all my scans are being put into a zip file instead of as a regular file. I'd like to have Image Capture make regular files again such as pdf etc.  What do I do?
    Does this have anything to do with going to OS Yosemite 10.10?

    Hi
    appzapper & such are pretty flaky at removing things usefully, imo.
    Use the uninstaller for speed download - http://www.yazsoft.com/products/speed-download/faqs/?how-to-un-install-speed-download-properly
    or get a hold of FindAnyFile or easyfind & search for Growl & Speeddownload & yazsoft - but from all I hear, the uninstaller works fine.
    Failing that, Safari's settings plist file isn't in caches
    Home/Library/Preferences/com.apple.safari.plist
    is the place
    & if you're still stuck, test & maybe download another browser using a New User Account.
