Python + Miro from testing leads to corrupted database

Today's update was a killer. Miro just blew up and took all my channel metadata with it, to the tune of a 'corrupted database' error.
It doesn't work even after I delete the .miro folder from my home directory. The app errors out when I try to close it and stays open...
OT: I've had my fair share of problems with Miro, so I think it's time to look for something less bloated and less fragile. Any ideas?

user3006396 wrote:
Hi Experts,
On one of our servers running Windows Server 2008, we have 5 Oracle 11g databases installed for our dev & testing environments.
We connect to those server DBs using the Oracle client.
Last Friday, for one of the DBs, I populated a million records into a few tables, and everything was fine that day.
Then suddenly on Monday, one of the front-end users complained that the DB was not working. We were not able to connect to that DB, but were able to connect to the other DBs on the same server.
The message thrown was:
"ORA-12514: TNS:listener does not currently know of service requested in connect descriptor"
I connected to the server and tried to log in from there; the same message was thrown, but I was able to connect to the other DBs.
Finally I set the ORACLE_HOME path to the DB I was not able to log in to and connected, with a session that looked something like this:
SQL> conn / as sysdba
Connected to an idle instance.
SQL> startup
ORACLE instance started.
Database mounted.
Database opened.
After that I was able to connect to that DB as well as all the other DBs on the server.
My question is: why did it suddenly stop working, and why did it not work until I used the STARTUP command?
Also, how do DBAs maintain servers with multiple ORACLE_HOMEs? (Though it is recommended to have multiple schemas instead of multiple DBs.)
I'm the only Oracle guy on the team and need to take care of everything (basically I'm a developer, new to DBA-related tasks).
My lead warned me to make sure that it won't happen again!!
Thanks guys... you are always awesome at giving good suggestions.

Find the alert_SID.log file and post an excerpt showing the entries from the period just prior to the DB STARTUP.
It appears the instance went missing.
Hopefully the log file will have clues as to why it went away.
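
For reference, ORA-12514 from every client usually just means the instance behind that service name is down, which matches why STARTUP fixed it. A minimal SQL*Plus sketch of the check-and-restart sequence, assuming ORACLE_SID and ORACLE_HOME are already set for the affected DB and you connect locally with "sqlplus / as sysdba":

-- start, mount and open the database (only needed if it is down)
startup
-- confirm the instance state; should now report OPEN
select status from v$instance;
-- push service registration to the listener immediately
alter system register;

Once the instance is open and registered, remote connections through the listener should work again without waiting for the next automatic registration cycle.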

Similar Messages

  • Has anybody copied queries from a "Test" database to a "Production" database?

    I am looking for the best method to copy queries from one database to another, from TEST to PRODUCTION. Other than Copy Express or copy/paste, is there another recommended approach, such as within SQL?

    Hi Alain,
    Copying anything from database A to database B using SQL means somehow using the SQL command INSERT. Please note that commands like INSERT, UPDATE and DELETE are NOT SUPPORTED, as they might cause data corruption.
    Kind regards
    Mario

  • Data recovery from datafiles of a corrupted database?

    Hi all,
    Actually one of my databases (Oracle 9i) is corrupted and I don't have .dmp files, so I installed a new database.
    Before that, I copied the oradata folder from the previous database (it is not running right now).
    Is there any way to restore the previous data from the files I have in the oradata folder into the new instance?
    Thanks & regards
    Vivek

    Has the database - even when it was corrupted - been shut down cleanly?
    If yes, and only if you have all the files, it should be startable.
    If any of the datafiles is corrupted and prevents database startup, you can always try the following:
    startup mount
    alter database datafile '<filename>' offline drop;
    If the database is/was in archivelog mode, you can omit the "drop" and only take the file offline.
    As soon as all corrupted datafiles are offline, including all non-corrupted ones from the same tablespace, the database should be able to open.
    Keep in mind that this is only true for datafiles not belonging to SYSTEM/SYSAUX or other instance-dependent tablespaces.
    I don't think you can restore data from corrupted datafiles, unless you're able to recover them.
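
    A minimal end-to-end sketch of that sequence, with a hypothetical datafile path:

    -- mount without opening, so damaged files can be taken offline
    startup mount
    -- repeat for each damaged (non-SYSTEM) datafile; the path is an example
    alter database datafile '/u01/oradata/ORCL/users01.dbf' offline drop;
    -- then try to open
    alter database open;

    Any data in the dropped files is lost; this only gets the rest of the database open so you can export what survives.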

  • Insert into all tables in a database on the Test server, selecting from all tables in database A on the Production server

    Hi friends, I need a suggestion from you on how to
    insert data into all tables in database "A" on the Test server,
    selecting data from all tables in database "A" on the Production server
    where id = 123.
    Database A has the same structure on Test and Production, and all tables have the Id column in common.
    The purpose of this insert: as we all know, Production has the latest data, and I need to push it to the Test server on request for a particular ID only (maybe once or twice a week).
    I have a linked server set up named "LINQ".
    An example for one table is below; likewise I need a script which does this for all 154 tables.
    Insert into ABC (id, name)   -- insert into the test server
    Select Id, name from LINQ.databasename.dbo.ABC where id = 123
    Please help me.
    Thanks

    Why not use the Export/Import Wizard for this, if you have read access to production?
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh
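
    If the wizard is not an option, here is a hedged T-SQL sketch that generates the 154 INSERT statements from the system catalog, assuming identical schemas on both sides, dbo-owned tables, and an id column in every table (the linked server name LINQ and the placeholder database name are from the post above):

    -- run on the Test database; prints one INSERT ... SELECT per table
    SELECT 'INSERT INTO ' + QUOTENAME(t.name) +
           ' SELECT * FROM LINQ.databasename.dbo.' + QUOTENAME(t.name) +
           ' WHERE id = 123;'
    FROM sys.tables AS t
    ORDER BY t.name;

    Copy the generated statements from the result grid and execute them; explicit column lists are safer than SELECT * if the schemas ever drift, and identity columns would additionally need SET IDENTITY_INSERT handling.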

  • How to apply PO template from test database to live database?

    I have created a PO template in my test database and I want to apply it to the live database. Is there any way to accomplish this task, i.e. any import or export method between the different databases (test environment to live environment)?

    First go to your TEST database. Open your PO template in design mode.
    Go to the Print Layout Designer menu --> Display document properties --> General tab and copy the document ID (example: POR200...).
    Please note:
    LIVE = [Your LIVE DB NAME] and
    TEST = [Your TEST DB NAME]
    USING SQL:
    INSERT INTO [LIVE].[dbo].[RDOC]
    SELECT * FROM [TEST].[dbo].[RDOC] WHERE [TEST].[dbo].[RDOC].DOCCODE = 'POR20...'
    (the document ID copied above)
    INSERT INTO [LIVE].[dbo].[RITM]
    SELECT [TEST].[dbo].[RITM].* FROM [TEST].[dbo].[RITM]
    WHERE [TEST].[dbo].[RITM].DocCode = 'POR20...'
    (the document ID copied above)
    This should work.
    Best Wishes

  • Migrating from Test Database to Production Database

    Hi,
    I have a situation where I created my Portal pages on a test database, and now I want to move everything from the test database to my production database. The portal version on both sites is the same (3.0.9.8). I have content areas and applications in my existing test site. What is the best and easiest way to achieve this?
    Thanks in advance.
    Regards,
    Jatinder

    Hi,
    the easiest way is to make an export of the portal schema and import it into your production database.
    Greetings

  • How to configure Crystal Report from Test DB to Production DB

    Hi,
    I have a VB/ASP.NET application that is currently using an Oracle DB.
    Now I'm creating a new Crystal report that will connect using any user ID, password, database and server name at runtime.
    When I created the Crystal report, I used the Oracle OLE DB connection with the test database logon information. Now I would like to make this dynamic at the form load event, where I have my Crystal Report viewer configured, and then pass the dataset regardless of whether it is coming from the test or production database.
    At form load, connecting with the test database, I can see the report properly when I provide the same logon information the report was created with; but when I switch the login to the production database, it doesn't pick up the correct datasource and logon information and still uses the test datasource.
    I would appreaciate if you can give me a solution or any links of the previous threads that has similar issue with resolution or any sample that may help.
    Thank you,
    Ryan

    Hi Ryan
    I'd start with these samples:
    csharp_win_dbengine.zip / vbnet_win_dbengine.zip
    csharp_win_subreport_logon.zip / vbnet_win_subreport_logon.zip
    Link to the samples and others is in this wiki: Crystal Reports for .NET SDK Samples - Business Intelligence (BusinessObjects) - SCN Wiki
    Next, I find the Crystal Reports for Visual Studio 2005 Walkthroughs to be an absolute gem (and it applies to all versions of CR and VS).
    And of course, the developer help files:
    SAP Crystal Reports .NET SDK Developer Guide
    SAP Crystal Reports .NET API Guide
    Now, one thing you don't mention is which API you are using: CR or RAS, both of which may be available to you depending on the version of CR and VS you are using. The RAS developer help files are here:
    Report Application Server .NET SDK Developer Guide
    Report Application Server .NET API Guide
    The RAS APIs are a bit more complicated, but way more powerful. And there is even a utility that will write the DB logon code for you:
    KBA: 1553921 - Is there a utility that would help in writing database logon code?
    Finally, don't forget to search this SCN space. The search box is in the top right corner. I find a short search string works best; e.g. 'crystal net logon' will return a number of KBAs (notes), wikis, blogs, etc.
    - Ludek
    Senior Support Engineer AGS Product Support, Global Support Center Canada
    Follow us on Twitter

  • [Forum FAQ] How to configure a Data Driven Subscription which gets multi-value parameters from one column of a database table

    Introduction
    In SQL Server Reporting Services, we can define a mapping between the fields that are returned in the query and specific delivery options and report parameters in a data-driven subscription.
    For a report with a parameter (such as YEAR) that allows multiple values, when creating a data-driven subscription, how can we pass a record like the one below and still show the correct data (data for years 2012, 2013 and 2014)?
    EmailAddress          Parameter        Comment
    [email protected]       2012,2013,2014   NULL
    In this article, I will demonstrate how to configure a data-driven subscription which gets multi-value parameters from one column of a database table.
    Workaround
    Generally, if we pass the “Parameter” column directly to the report in step 5 when creating the data-driven subscription, the value “2012,2013,2014” will be regarded as a single value, and Reporting Services will use “2012,2013,2014” to filter the data. However, no record has a YEAR field equal to “2012,2013,2014”, so when the subscription executes we get an error in the log (C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles):
    Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Name' is not a valid value.
    This means there is no such value in the parameter's available values list; it is an invalid parameter value. If we change the parameter records as below:
    EmailAddress          Parameter   Comment
    [email protected]       2012        NULL
    [email protected]       2013        NULL
    [email protected]       2014        NULL
    In this case, Reporting Services will generate three reports for the one data-driven subscription, each report covering only one year, which obviously cannot meet the requirement.
    Currently there is no direct solution for this issue. The workaround is to create two reports: one used by end users to view the report, the other used to create the data-driven subscription.
    On the report used to create the data-driven subscription, uncheck the “Allow multiple values” option for the parameter, and do not specify available values or default values for this parameter. Then change the filter:
    From:
    Expression: [ParameterName]
    Operator:   In
    Value:      [@ParameterName]
    To:
    Expression: [ParameterName]
    Operator:   In
    Value:      =Split(Parameters!ParameterName.Value, ",")
    In this case, we can pass a value like "2012,2013,2014" from the database to the data-driven subscription.
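
    A different way to get the same effect, sketched here as plain T-SQL rather than a report filter: keep the parameter as a single string and do the membership test in the dataset query itself (table and column names below are hypothetical):

    -- @Parameter arrives as one string such as '2012,2013,2014'
    SELECT *
    FROM   dbo.SalesData
    WHERE  ',' + @Parameter + ',' LIKE '%,' + CAST([Year] AS varchar(4)) + ',%';

    This avoids touching the report filter at all, at the cost of a non-sargable predicate, so it suits small to medium tables.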
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012

    For every Auftrag, there are multiple Position entries.
    The rest of the blocks don't seem to have any relation.
    So you can check this code to see how the internal table lt_str is built: its first 3 fields hold the data from Auftrag, and the next 3 fields hold the Position data. The structure is flat, assuming that every Position record is related to the preceding Auftrag.
    Try out this snippet.
    DATA lt_data TYPE TABLE OF string.
    DATA lv_data TYPE string.
    " Load the raw file into an internal table, one line per row.
    CALL METHOD cl_gui_frontend_services=>gui_upload
      EXPORTING
        filename = 'C:\temp\test.txt'
      CHANGING
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 19.
    CHECK sy-subrc EQ 0.
    " Flat target structure: a1-a3 hold [Auftrag] fields, p1-p3 hold [Position] fields.
    TYPES:
    BEGIN OF ty_str,
      a1 TYPE string,
      a2 TYPE string,
      a3 TYPE string,
      p1 TYPE string,
      p2 TYPE string,
      p3 TYPE string,
    END OF ty_str.
    DATA: lt_str TYPE TABLE OF ty_str,
          ls_str TYPE ty_str,
          lv_block TYPE string,
          lv_flag TYPE boolean.
    LOOP AT lt_data INTO lv_data.
      " Remember which [Block] header we are under; skip the header line itself.
      CASE lv_data.
        WHEN '[Version]' OR '[StdSatz]' OR '[Arbeitstag]' OR '[Pecunia]'
             OR '[Mita]' OR '[Kunde]' OR '[Auftrag]' OR '[Position]'.
          lv_block = lv_data.
          lv_flag = abap_false.
        WHEN OTHERS.
          lv_flag = abap_true.
      ENDCASE.
      CHECK lv_flag EQ abap_true.
      " Keep the latest Auftrag fields; append one row per Position line.
      CASE lv_block.
        WHEN '[Auftrag]'.
          SPLIT lv_data AT ';' INTO ls_str-a1 ls_str-a2 ls_str-a3.
        WHEN '[Position]'.
          SPLIT lv_data AT ';' INTO ls_str-p1 ls_str-p2 ls_str-p3.
          APPEND ls_str TO lt_str.
      ENDCASE.
    ENDLOOP.

  • Follow-on documents are not visible in MIRO on the TEST server, AWSYS = PRD300

    Dear Experts,
    The Test server was refreshed around mid-June 2011 with data from the Production server. The follow-on documents are not visible for the invoice documents in MIRO on the TEST server because of the value in tables BKPF and RBKP, field AWSYS = PRD300.
    For purchase orders newly created after the refresh, the accounting documents can be seen for the goods receipt (MIGO_GR - display) and invoice documents (MIRO).
    We had already raised this issue in March and got the feedback from SAP shown below. Accordingly, we developed and ran the program "zzlogsys2", which updates the Logsys/Awsys field from PRD300 (of the Production server) to that of the current server, i.e. TST300, as required.
    After that, the FI documents for the material documents are visible in MIGO, but the follow-on documents are still not visible for the invoice documents in MIRO.
    We checked notes 781498 and 28958 to see if the logical system is correctly assigned, and found that in table RBKP, after entering document number and fiscal year, the field AWSYS is "PRD300" and not "TST300" as it should be. We will take up the activity of updating table RBKP as well, as we are currently doing for tables MKPF & BKPF.
    But to get a clear picture of whether what we are doing is correct, please advise on the following:
    1) Is it a correct process, done by our SAP Basis team, that every time any server (e.g. Test or Quality) is refreshed with Production server data, the field AWSYS in various transaction tables gets the value "PRD300", which then has to be replaced by running a program such as ZZLOGSYS?
    REPORT ZZLOGSYS.
    TABLES: T000, MKPF.
    DATA: NEW_SYS LIKE MKPF-AWSYS.
    PARAMETER: OLD_SYS LIKE MKPF-AWSYS.
    " Read the current client's logical system name.
    SELECT SINGLE * FROM T000 WHERE MANDT EQ SY-MANDT.
    NEW_SYS = T000-LOGSYS.
    CHECK NOT NEW_SYS IS INITIAL.
    " Rewrite every MKPF record still pointing at the old logical system.
    UPDATE MKPF SET AWSYS = NEW_SYS
    WHERE AWSYS = OLD_SYS.
    WRITE:/ 'Number of updates: ', SY-DBCNT.
    2) If the above process is correct and normal, which other tables on a particular server, apart from MKPF, BKPF and RBKP, need the value of field AWSYS updated in the same way, replacing the value "PRD300"?
    3) If the process in point 1 is not correct, what is the correct process the Basis team should follow while refreshing any target server with production data, so that the target server retains its own value in field AWSYS and does not show "PRD300"?
    With three servers (TEST, DEV & Quality) recently refreshed from the Production server to bring all servers in sync for an HR patch application, we now have this situation on all three servers.
    Thanks in advance,
    Anil Shanbhag

    It is appropriate to move this thread from ERP-MM to Enterprise Resource Planning (ERP).
    Edited by: Jeyakanthan A on Jul 7, 2011 4:56 PM

  • Steps for Go Live for Webtools moving from Test to Production

    Further to the useful checklist in the wiki recently supplied by Bryce, can I ask for some minor gaps to be filled (perhaps by inserting a few additional steps in that document)?
    I have yet to hit the issues created by a go-live implementation, but this is only a matter of time...
    I am trying to work out what needs to occur between having:
    a) a test B1 database successfully synching to a test WT site and db set up
    and
    b) the production B1 database initially synched to the live WT db, ready to accept the first live transactions.
    1) Copy WT site from test to live location
    2) Change the Server config, Settings and Tables tabs in the Synch Manager; click on 'Install Plugin' to add the custom fields to the production B1 database
    3) ......
    I am stuck as to what needs to be done to "reset" the WT back-end db and how one goes about this.
    Coupled with this, and probably part of the answer, is the 'Initialize Synch' button. It would be useful to understand exactly what this does in terms of data - presumably it is only transactional data - but which tables are affected?
    Also, if there's anything else that I might have overlooked in terms of potential pitfalls, I'd be grateful for advice - the recent posts regarding product trees and images come to mind.
    Thanks

    First thing is to plan for some downtime in your B1 databases. Nothing is more frustrating than having new data come into B1 when you're trying to set this up, so perhaps do this at night or on a weekend when no one is using B1.
    The Install Plugin operation adds UDFs, edits the stored proc in the B1 db, and creates the queue table, PRX_Transaction_Queue.
    Initialize Synch runs upgrade scripts (if applicable), deletes all data that has synched from B1 previously (or would synch from WT to B1, like a test order created in WT), and inserts all relevant data into the queue table in the B1 db. The synch IDs are also reset. It's pretty much the same list as is displayed on the Settings page of the Synch Manager. Please someone correct me if I'm missing something here!
    Prerequisites:
    - Name the WT db according to its position in the environment. B1Webtools is not the greatest name; it's meant to be a jumping-off point. Rename your WT db the way you would for B1, i.e. WebtoolsLive, WebtoolsTest. It makes things easier.
    Two options:
    1) Presuming you have a test WT db and a test B1 db, or even a test WT db and a production (live) B1 db, you could duplicate the test WT db and change the Synch Manager config to point to the live B1 db. Install the "plugin" on the live B1 db and enter your table mappings. Initialize Synch will delete all the data that has synched from the test B1 db to the WT db (now the live db) and reset the synch IDs to zero. These will be populated during the synch. Then run the synch.
    Caveat: this option is fine if your synch takes a "short" period of time - short being an hour. If you have, say, 20,000 business partners with 3 years of order history and 300 lines per order, expect several hours.
    2) Create a test synch profile and a live synch profile in tandem, and update only the live synch profile with data. Also, have two Web Tools websites but work primarily in the live one. The Synch Manager will auto-synch both profiles every time the service runs (at an interval set by you).
    When it's go time, copy the live dbs over the test dbs so you have a test environment exactly matching your live environment. Obviously, as soon as new data goes into the live db it's out of synch, but this way you have a test system you can break, or use to test upgrades, etc.
    As you might guess, there are a number of ways to go with this. It really does depend on a) how comfortable you are with moving data around in SQL, and b) what your setup entails.
    I think the best thing you can do is keep it as simple as possible. Don't overthink things, and make sure you have a backup of everything before you start.
    Good luck!

  • Filter corrupts database

    Hi - has anyone had issues creating a filter (on total float of a baseline, specifically) that has corrupted / locked out the database?

    You can try the following solution. We had this happen at a client a few months ago. Not sure what version you're on, so the solution could be different.
    1) Open a layout known not to cause the error and choose View, Layout, Save As.
    2) Provide a new name, different from the name of the corrupt layout.
    3) Set the layout up to look the same as the original layout, based on the previously noted settings.
    4) Delete the corrupt layout.

  • Error in extracting data from SAP using mySQL server database

    Hello Experts.
    We are now using a MySQL Server 2005 database for Nakisa OrgChart. We have already configured the SAPExtractor settings. The SAP source has been provided, and the test connection to the destination database is successful. We manually created the databases ExtractedData and AnalyticData in the MySQL management studio.
    We are getting the following error upon starting the extraction:
    Processing Function Read Table Function/BAPI Downloading Tables Organizational Assignment
    No Tables were downloaded for Read Table Function/BAPI. Function is flagged as critical. Terminating Extraction.
    Processing Stopped ! ! !
    Downloading from SAP completed.
    Processing Completed.
    From CDS.log
    ERROR: Sap Authentication : Source {SAP.Connector}: Message {An attempt was made to load a program with an incorrect format. (Exception from HRESULT: 0x8007000B)}
    We would appreciate some assistance with this configuration.
    Thanks,
    Angelo
    Accenture inc.
    SAP Basis

    Hello Luke,
    We have not made any modification to the downloadschema file. When we open the file, we receive the message: "The XML page cannot be displayed.
    Cannot view XML input using style sheet. Please correct the error and then click the Refresh button, or try again later."
    The SAP account we used has the SAP_ALL and SAP_NEW authorizations. Are these authorizations sufficient to perform the data extraction?
    Thanks,
    Angelo

  • Strange behavior from script when dismounting content databases via PowerShell script

    So, this is my first time posting a question here. I have taught myself PowerShell, and between friends, co-workers and internet forums I have been able to pick through most of the issues I have seen. This one has me stuck.
    A little background: I have a script that goes through each web app in a farm, finds the database that hosts the root site, and detaches all other content databases from the farm. It also writes a CSV file so that a process can come along after the fact and re-attach the databases. I use this for patching, so psconfig does not take quite as long, as we have more than 50 content databases.
    Here is the odd part. After the first content database is detached, the second one is skipped, but the remainder are detached. So I am left with two databases per web app: the one that holds the root site, and whatever the second database is after the first detach. If I comment out the line that actually does the detach, this does not happen.
    Using ISE and adding breakpoints, what I have found is that when I initially populate the array with the content databases ($var = $webapp.contentdatabases), all databases are in there. When I start the foreach, all databases are in there, right up to the point that I detach the first database. As soon as I detach that database, the detached database gets removed from the array. It does this even if I set the variable to read-only. It only happens with the first database detached in each web app.
    I will try to explain this again with an example. This test farm has 4 databases: DB1, DB2, DB3 and DB4. DB1 holds the root site. The logic pulls the web app info and gets the URL of the web app. It takes that info and finds the site collection with that name (the root site), and finds the database name from there. It then gets a list of all databases in the web app and starts to loop through. The first one in the list is DB1. It has the root site, so the logic says do not delete. It comes back to the loop with the next database, DB2. At this point, the array still has all 4 databases. DB2 does not host the root site, so the logic gathers information and calls dismount-spcontentdatabase. As soon as that is called, that database is removed from the array. So at this point we have processed position 0 and position 1 in the array. Since the original position 1 has been removed, DB3 is now at position 1, so the loop moves past it. Since we have now skipped DB3, we find DB4 and dismount it. Now we have DB1, DB3 and DB4 in the array. My test environment has more, and all are still there except for the first one removed. I have banged my head against the desk with this one. If someone has some insight, I would appreciate it.
    Here is the code that does it.
    get-spwebapplication | foreach-object {
      $webapp = $_
      $waname = $webapp.displayname
      $waurl = $webapp.url
      $cdbs = $webapp.contentdatabases
      $rsite = get-spsite $waurl -erroraction silentlycontinue
      $rsitedb = $rsite.contentdatabase
      $rsitedbn = $rsitedb.name
      foreach ($cdb in $cdbs) {
        $cdbname = $cdb.name
        #Write-Host "$cdbname" -foregroundcolor red
        if ($cdbname -ne $rsitedbn) {
          $warn = $cdb.warningsitecount
          $max = $cdb.maximumsitecount
          $current = $cdb.currentsitecount
          $dbserver = $cdb.server
          Write-Output "$waname,$cdbname,$dbserver,$warn,$max,$current" >> $outfile
          Write-Host "dismounting $cdbname" -foregroundcolor green
          dismount-spcontentdatabase -identity $cdbname -confirm:$false
        }
        else {
          Write-Host "$cdbname holds root site. No action." -foregroundcolor yellow
        }
      }
    }

    What about keeping an array of all the non-root databases (maybe just the database names) instead of dismounting them in the foreach loop over all the databases? After we have all the databases that need to be dismounted, in a for loop counting from 0 to the number of databases, check the name of the database at each index and dismount that database.
    Pseudo code would be:
    $dbstobedismounted = @()
    foreach ($cdb in $cdbs) {
        $dbstobedismounted += $cdb.Name   # collect names instead of actually dismounting
    }
    for ($i = 0; $i -lt $webapp.ContentDatabases.Count; $i++) {
        $databasename = $webapp.ContentDatabases[$i].Name
        foreach ($dbtobedismounted in $dbstobedismounted) {
            if ($databasename -eq $dbtobedismounted) {
                dismount-spcontentdatabase -identity $webapp.ContentDatabases[$i]
            }
        }
    }
    I'm sure you might have to do some modifications to make it work, but the point I'm making is: instead of dismounting in the foreach loop, build a list and dismount in a separate for loop. Hope it helps.
    rani

  • BerkeleyDB hangs trying to read corrupted database

    The provided corrupted database file causes a reproducible hang in the db->get method of Berkeley DB.
    Reproduced with Berkeley DB 4.8.30 and 4.3.29 under Linux.
    Corrupted database file: https://cfengine.com/bugtracker/file_download.php?file_id=123&type=bug
    Testcase:
    #include <string.h>
    #include <db.h>
    int main(void)
    {
        DB *dbp;
        DBT key, value;
        /* DBTs must be zeroed before use */
        memset(&key, 0, sizeof(key));
        memset(&value, 0, sizeof(value));
        key.data = "x86_64";
        key.size = 6;
        db_create(&dbp, NULL, 0);
        dbp->open(dbp, NULL, "bdbtest.db", NULL, DB_BTREE, 0, 0644);
        dbp->get(dbp, NULL, &key, &value, 0);   /* hangs here on the corrupted file */
        return 0;
    }
    gdb backtrace:
    (gdb) bt
    #0 __memp_fget (dbmfp=0x602120, pgnoaddr=0x7fffffffe2ec, ip=<value optimized out>, txn=0x0,
    flags=<value optimized out>, addrp=<value optimized out>) at ../dist/../mp/mp_fget.c:293
    #1 0x00007ffff7aa94dc in __bam_search (dbc=0x603880, root_pgno=<value optimized out>,
    key=<value optimized out>, flags=<value optimized out>, slevel=<value optimized out>,
    recnop=<value optimized out>, exactp=0x7fffffffe41c) at ../dist/../btree/bt_search.c:753
    #2 0x00007ffff7a967a6 in __bamc_search (dbc=0x603880, root_pgno=<value optimized out>,
    key=<value optimized out>, flags=27, exactp=<value optimized out>)
    at ../dist/../btree/bt_cursor.c:2785
    #3 0x00007ffff7a98027 in __bamc_get (dbc=0x603880, key=<value optimized out>, data=0x7fffffffe5d0,
    flags=27, pgnop=<value optimized out>) at ../dist/../btree/bt_cursor.c:1088
    #4 0x00007ffff7b3e27f in __dbc_iget (dbc=0x0, key=<value optimized out>, data=0x7fffffffe5d0,
    flags=27) at ../dist/../db/db_cam.c:934
    #5 0x00007ffff7b4b250 in __db_get (dbp=<value optimized out>, ip=<value optimized out>,
    txn=<value optimized out>, key=0x7fffffffe600, data=0x7fffffffe5d0, flags=27)
    at ../dist/../db/db_iface.c:779
    #6 0x00007ffff7b4b57b in __db_get_pp (dbp=0x601a60, txn=0x0, key=0x7fffffffe600,
    data=0x7fffffffe5d0, flags=0) at ../dist/../db/db_iface.c:694
    #7 0x00000000004005df in main () at bdbtest.c:13

    Hi,
    Your database is corrupted, hence it is no surprise that you cannot retrieve from it successfully:
    # db_verify -o cf_classes.db.duff
    db_verify: Page 4: btree or recno page is of inappropriate type 0
    db_verify: Page 4: totally zeroed page
    db_verify: Page 4: Btree level incorrect: got 0, expected 1
    db_verify: Page 3: unterminated leaf chain
    db_verify: cf_classes.db.duff: DB_VERIFY_BAD: Database verification failed
    Verification of cf_classes.db.duff failed.
    Dumping the database with db_dump -da shows the following where page 4 should have been:
    page 0: invalid: LSN [0][0]: level 0
         prev: 0 next: 0 entries: 0 offset: 0
    and verification with a hex editor over the 4KB of page 4 shows that this page is all 0s (zeros).
    You need to run recovery over this database. For backup and recovery procedures, review the following:
    - chapter 11 in the Berkeley DB Reference Guide, Berkeley DB Transactional Data Store Applications;
    - in Getting Started with Berkeley DB Transaction Processing, the sections on Backup Procedures and Recovery Procedures.
    If your application is not using transactions (it is not set up as TDS, Transactional Data Store), you should review the section on Handling failure in Data Store and Concurrent Data Store applications.
    Regards,
    Andrei

  • I want to export a few tables from a 9.2.0.2 database and import them into 10.2.0.1

    Dear All,
    I am a newbie to Oracle. My company wants to export a few tables from a 9.2.0.2 database to 10.2.0.2 (they are using Developer 2000 4.5 and Oracle Reports 6i).
    The purpose of this activity is to use the 9i forms and reports in 10g and check if they work.
    The configuration of the test machine is a Core 2 Duo processor, 2 GB RAM and a 500 GB hard disk.
    I have a few questions; as I am a newbie I am a little afraid, so please help me:
    What precautions do I have to take before running the export command on the 9i database?
    What precautions do I have to take before running the import command on the 10g database?
    My manager wants me to export it with users, synonyms, grants and privileges, as it is a production database. How do I do that?
    How do I check what the grants and privileges of a user are in 9i?
    My production database is 9i on Solaris 10, and I am going to install the 10g database on Windows XP as a test instance. Is it correct to do it this way?
    My manager said I should use NLS_LANG as AMERICAN_AMERICA.AR8MSWIN1256 as the character set while installing the database.
    Note: I have installed Oracle 10g from OTN.
    Please help me as early as possible.
    Regards,
    user9007339

    No, that is not what it means. You have to use exactly the right version at the right time. From a lower version to a higher one, you need to use the source DB's exp tool and the target DB's imp. From higher to lower, you need to use the target DB's exp and the target DB's imp.
    In other words:
    Always use exp from the lowest version involved and imp from the target version!
    See Metalink, Note:132904.1
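
    For the separate sub-question about checking a user's grants and privileges in 9i, a minimal sketch against the standard DBA views (SCOTT is an example user; run from a privileged account):

    -- roles granted to the user
    SELECT granted_role FROM dba_role_privs WHERE grantee = 'SCOTT';
    -- system privileges granted directly
    SELECT privilege FROM dba_sys_privs WHERE grantee = 'SCOTT';
    -- object privileges granted directly
    SELECT owner, table_name, privilege FROM dba_tab_privs WHERE grantee = 'SCOTT';

    The same queries can be rerun on the 10g side after the import to confirm the grants arrived.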
