Migration of LONG and LONG RAW datatypes

Just upgraded a DB from 8.1.7.4 to 10.2.0.1.0. The post-upgrade tasks speak of migrating tables with LONG and LONG RAW datatypes to CLOBs or BLOBs. All of the tables in the DB with LONG or LONG RAW columns are in the SYS, SYSMAN, MDSYS or SYSTEM schemas (per a query of DBA_TAB_COLUMNS). Are these to be converted? Or does Oracle want us to convert user data only (USER_TAB_COLUMNS)?

USER_TAB_COLUMNS tells you the columns in the tables owned by the current user. There may well be many schemas on your system, created by you, that contain objects. I suppose you could log in to each of those schemas and query its USER_TAB_COLUMNS view, but it's probably easier to query DBA_TAB_COLUMNS with an appropriate WHERE clause on the owner of the objects.
Justin
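For reference, a sketch of the kind of DBA_TAB_COLUMNS query Justin describes, together with the in-place LONG-to-LOB conversion Oracle has supported since 9i. The owner exclusion list and the table/column names below are illustrative only; extend the list to match the Oracle-maintained schemas in your own database.

```sql
-- List LONG / LONG RAW columns in application schemas only.
SELECT owner, table_name, column_name, data_type
FROM   dba_tab_columns
WHERE  data_type IN ('LONG', 'LONG RAW')
AND    owner NOT IN ('SYS', 'SYSTEM', 'SYSMAN', 'MDSYS');

-- An application LONG column can then be migrated in place
-- (LONG -> CLOB, LONG RAW -> BLOB); names here are hypothetical.
ALTER TABLE app_owner.app_table MODIFY (long_col CLOB);
```

The dictionary columns reported under SYS and friends are Oracle's own and should be left alone; the post-upgrade task is about application data.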

Similar Messages

  • Word File Takes Longer and Longer to Save

    I’ve been working daily on the same 100-page document for several months. I had no problems with the document under Word 2003. However, under Word 2010 the file grows in size over time, and saving the file takes longer and longer to the point where there
    are significant timeouts (Word is “not responding”) whenever an automatic save occurs. 
    I don't want to disable automatic saves because Word does crash for me on rare occasions, generally if I do something "too fast".
    I’ve found a workaround for the problem: every three weeks or so, after the automatic saves have become painfully long, I copy the document to the clipboard (except for the last paragraph mark), open a new document based on the relevant template, and paste the clipboard contents into the new document. I rename the old version of the document and the new version becomes the working version. This reduces the file size (currently around 1.4 MB) by about 150 KB and the problem goes away for another three weeks.
    Certain aspects of my situation are unusual, and these may or may not be relevant to the problem:
    At the end of each day I use (via a macro) the Review, Compare feature of Word to compare the document with the previous day’s version to allow me to reread any changes I made to it.
    I use various other macros for intelligent page-turning, resizing windows, smart Find, etc.
    I maintain the document as a DOC file (Word 97-2003 Compatibility Mode) because I need to share the document with an organization that requires this format.
    The document flips back and forth a few times between being a one-column and two-column document.
    The document has a table of contents on the last page.
    The headings in the document have embedded section and subsection numbers.
    The document has numerous embedded SEQ and cross-reference fields.
    The document has embedded EMF pictures that were generated by a non-Microsoft application.
    The long times to save the file and the temporary solution I’ve found to the problem suggest that some "junk" is accumulating “in” the last paragraph mark. This junk doesn't cause any operational errors, but it slows things down to the point where
    the auto-save times out and I temporarily get the distracting "not responding" message. It would be nice if Word could automatically eliminate the junk in the last paragraph mark so that I wouldn’t have to do it manually.
    Do you have any suggestions for how I might eliminate the problem?
    I'd be pleased to send a copy of the slow-saving file to a Microsoft Word programmer for diagnosis of the problem.
    I have up-to-date Windows 7 professional (64 bit) and Word 2010 14.0.6129.5000 (32 bit).
    Thanks for your help,
    Don Macnaughton

    I am experiencing exactly the same save issue, although I cannot use the suggestion of copying to a new document as I have a lot of references within the same document and I'm scared that I'll lose them (or mess them up).
    It is nearly a year later, did you have any luck?
    Francois,
    I'm still experiencing the problem. However, I've now converted the document from a DOC to a DOCX, but that made no difference.  So every 18 or so days I copy all of the document into a new document except for the last paragraph mark
    and the problem goes away for another 18 or so days.  For my document this solution is fully reliable although it's less convenient because it's a little complicated and I worry I may make a mistake or some text may be lost in the transition.
    So I'm still looking for a solution to the problem. Is there anything unique about your document or your handling of the document that might be the cause of the problem?  Are you using macros, Compare Versions, switching back and forth between
    one and two columns, or anything else that is common to the features that I list in my first post in this thread?
    You might want to try my copying solution as a test while keeping your original document as the official version that you continue to work with.  You could then check the test document very carefully to see if my solution works with your
    document.  You might find that you can trust my solution (or you might not). 
    By the way, I make sure that the copy worked properly by doing a Compare Versions of the old and new documents.  (Surprisingly, sometimes the compare finds very minor differences between the two documents, but usually not.)
    If the problem really bothers you, you can hire Microsoft Support, although that will cost you some money.  If you do that, please let us know the outcome.
    Don Macnaughton
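    For what it's worth, the manual copy workaround described above can be scripted. A rough VBA sketch, untested, which assumes the new document should be based on the same attached template as the old one:

    ```vba
    Sub CopyAllButLastParagraphMark()
        ' Copy everything except the final paragraph mark into a fresh
        ' document based on the same template, per the workaround above.
        Dim src As Document, dst As Document
        Set src = ActiveDocument
        ' Range up to, but not including, the last paragraph mark.
        src.Range(0, src.Content.End - 1).Copy
        Set dst = Documents.Add(Template:=src.AttachedTemplate.FullName)
        dst.Content.Paste
    End Sub
    ```

    You would still want to rename and verify the result (e.g. with Compare Versions, as Don does) before trusting it as the working copy.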

  • Why do contact names keep getting longer and longer?

    On some of my contacts, the name keeps getting longer and longer.
    For example, if my name is Jason Kahng.
    After a few weeks, when I lookup my name in Contacts it comes up as "Jason Kahng Jason Kahng Jason Kahng"
    This is very annoying. If I edit the name back down to Jason Kahng, it will do this again a few weeks later.
    Please let me know how to fix this problem.
    Thank you!

    With iOS 5.0.1, your iPhone uses Wi-Fi to sync your contacts, not iTunes.
    Tap Settings > iCloud
    Switch Contacts off then back on then reset the iPhone.
    Hold the On/Off Sleep/Wake button and the Home button down at the same time for at least ten seconds, until the Apple logo appears.

  • How can I reset my son's restrictions passcode? I have failed 9 attempts and it locks me out longer and longer. Thx

    How can I reset my son's restrictions passcode? I have failed 9 attempts and it locks me out longer and longer. Thx

    see here
    http://support.apple.com/kb/HT1212
    No choice

  • 3.x Workbooks taking longer and longer to run each week

    Hey all, I have a user who has embedded 5 versions of the same query into a workbook. He runs this workbook every Monday. When he first created the workbook it took 30 minutes to run. Each week that goes by, the workbook takes longer and longer to run, eventually reaching a runtime of 2 hours. Periodically my user has to go and make a change to the workbook, and after he recreates it the workbook goes back to taking 30 minutes to run.
    Is there some kind of a buffer that is filling up that I don't know about?  Is there a way I can refresh the workbook so that the runtime doesn't creep like it is doing?
    Thanks
    Adam


  • The pinwheel is spinning for longer and longer periods of time. Have to manually shut off iMac sometimes to get it to restart.

    The pinwheel is spinning  for longer and longer periods of time. Why and what do I need to do about that?

    You have 10.6 on that machine; I suggest you stick with it as long as possible, for performance, third-party hardware, and software reasons.
    Consider 10.8 (not 10.7) when it's released, because 10.7 and 10.8 will require a new investment in software and newer third-party hardware, as they require newer drivers the old machines won't have. (Forced upgrade because of software, really nice of them.)
    http://roaringapps.com/apps:table
    As far as your Safari problem goes, do these things until it's resolved:
    1: Software Update fully under the Apple menu.
    2: Check the status of your plug-ins and update (works for all browsers); also install Firefox and see if your problems continue. You should always have at least two browsers on the machine just in case one fails.
    https://www.mozilla.org/en-US/plugincheck/
    Flash install instructions/problem resolution here if you need it.
    How to install Flash, fix problems
    3: Install Safari again from Apple's web site
    https://www.apple.com/safari/
    4: Run through this list of fixes, stopping with #16 and report back before doing #17
    Step by Step to fix your Mac

  • Startup process gets longer and longer

    Hi, my parents have an iMac (purchased in 09) running OS X 10.9.5. Over the years and especially lately, the time it takes to startup has gotten longer and longer.
    Any advice or suggestions would be much appreciated.

    Babowa,
    Below are the results of running etrecheck and I apologize for taking so long getting the results to you!  Thanks very much!
    Problem description:
    Startup process takes longer and longer (especially lately)!
    EtreCheck version: 2.1.5 (108)
    Report generated December 19, 2014 at 12:29:58 PM PST
    Click the [Support] links for help with non-Apple products.
    Click the [Details] links for more information about that line.
    Click the [Adware] links for help removing adware.
    Hardware Information: ℹ️
      iMac (20-inch, Early 2009) (Verified)
      iMac - model: iMac9,1
      1 2.66 GHz Intel Core 2 Duo CPU: 2-core
      4 GB RAM Upgradeable
      BANK 0/DIMM0
      2 GB DDR3 1067 MHz ok
      BANK 1/DIMM0
      2 GB DDR3 1067 MHz ok
      Bluetooth: Old - Handoff/Airdrop2 not supported
      Wireless:  en1: 802.11 a/b/g/n
    Video Information: ℹ️
      NVIDIA GeForce 9400 - VRAM: 256 MB
      iMac 1680 x 1050
    System Software: ℹ️
      OS X 10.9.5 (13F34) - Uptime: 0:4:4
    Disk Information: ℹ️
      Hitachi HDT721032SLA380 disk0 : (320.07 GB)
      EFI (disk0s1) <not mounted> : 210 MB
      Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
      Macintosh HD (disk1) / : 318.88 GB (275.79 GB free)
      Encrypted AES-XTS Unlocked
      Core Storage: disk0s2 319.21 GB Online
      HL-DT-ST DVDRW  GA11N 
    USB Information: ℹ️
      Apple Inc. Built-in iSight
      Apple Computer, Inc. IR Receiver
      Apple Inc. BRCM2046 Hub
      Apple Inc. Bluetooth USB Host Controller
    Gatekeeper: ℹ️
      Mac App Store and identified developers
    Adware: ℹ️
      Conduit [Remove]
    Startup Items: ℹ️
      HP IO: Path: /Library/StartupItems/HP IO
      Startup items are obsolete in OS X Yosemite
    Launch Daemons: ℹ️
      [loaded] com.adobe.fpsaud.plist [Support]
    User Launch Agents: ℹ️
      [running] com.zeobit.MacKeeper.Helper.plist [Support]
    User Login Items: ℹ️
      iTunesHelper UNKNOWNHidden (missing value)
      Skype UNKNOWN (missing value)
      Documents FolderHidden (/Users/[redacted]/Documents)
      HP Scheduler Application (/Library/Application Support/Hewlett-Packard/Software Update/HP Scheduler.app)
    Internet Plug-ins: ℹ️
      FlashPlayer-10.6: Version: 15.0.0.246 - SDK 10.6 [Support]
      QuickTime Plugin: Version: 7.7.3
      Flash Player: Version: 15.0.0.246 - SDK 10.6 Mismatch! Adobe recommends 16.0.0.235
      Default Browser: Version: 537 - SDK 10.9
      OfficeLiveBrowserPlugin: Version: 12.3.6 [Support]
      Silverlight: Version: 4.0.60129.0 [Support]
      iPhotoPhotocast: Version: 7.0
    3rd Party Preference Panes: ℹ️
      Flash Player  [Support]
    Time Machine: ℹ️
      Skip System Files: NO
      Auto backup: NO - Auto backup turned off
      Destinations:
      Dic's Thumb Drive [Local]
      Total size: 0 B
      Total number of backups: 0
      Oldest backup: -
      Last backup: -
      Size of backup disk: Excellent
      Backup size 0 B > (Disk size 0 B X 3)
    Top Processes by CPU: ℹ️
          54% Mail
          7% WindowServer
          4% HP Device Monitor
          4% NotificationCenter
          3% opendirectoryd
    Top Processes by Memory: ℹ️
      198 MB Mail
      112 MB mds_stores
      94 MB Microsoft Word
      82 MB WindowServer
      77 MB Safari
    Virtual Memory Information: ℹ️
      1.45 GB Free RAM
      1.63 GB Active RAM
      460 MB Inactive RAM
      483 MB Wired RAM
      422 MB Page-ins
      0 B Page-outs
    Diagnostics Information: ℹ️
      Dec 19, 2014, 12:26:32 PM Self test - passed

  • OracleBinary and LONG RAW datatypes

    Hi,
    I have a table (Oracle XE 10g) with a LONG RAW column; the column contains an XML file for XSLT transformation.
    I want to read the column. I wrote this code, but it doesn't work:
    Private Sub leggiRiparto()
        Dim codiceProvincia As String
        Dim codiceComune As String
        Dim codtipoelezione As String
        ' Query to read from the outputscrutinio table of the elections DB
        Dim strSQL As New String("SELECT o.DATAELEZIONE, o.CODPROVINCIA, o.CODCOMUNE, o.CODTIPOELEZIONE, o.XML, c.DESCCOMUNE " & _
            "FROM outputscrutinio o, comune c " & _
            "WHERE o.dataElezione = " & dataElezione & " and o.codtipoelezione = " & tipoelezione & _
            " AND o.CODPROVINCIA = c.CODPROVINCIA " & _
            " AND o.CODCOMUNE = c.CODCOMUNE " & _
            "ORDER by o.codprovincia, o.codcomune")
        Dim riparti As OracleCommand
        riparti = New OracleCommand(strSQL, dbElezioniDDE)
        Try
            'ApriDB()
            Dim dtRead As OracleDataReader = riparti.ExecuteReader
            If Not dtRead.HasRows Then
                ' "Warning: there are no allocations available! Contact the DB administrator"
                MessageBox.Show("Attenzione non ci sono riparti disponibili ! Contattare l'amm.re del DB")
            Else
                While (dtRead.Read())
                    codiceProvincia = dtRead.Item("CODPROVINCIA")
                    codiceComune = dtRead.Item("CODCOMUNE")
                    codtipoelezione = dtRead.Item("CODTIPOELEZIONE")
                    Dim blob As OracleBinary = dtRead.GetOracleBinary(4)
                    'Dim blob As OracleBlob = dtRead.GetOracleBlob(4)
                    Dim indice As Integer = blob.Length
                    Dim nomefile As String = "Riparto Seggi" & "-" & dataElezione & "-" & codtipoelezione & "-" & codiceProvincia & "-" & codiceComune
                    Dim fileXML(indice) As Byte
                End While
            End If
        Catch ex As OracleException ' catch for Oracle exceptions
            Select Case ex.Number
                Case 12545
                    ' "The database is currently unavailable."
                    MessageBox.Show("Il database è al momento non disponibile.")
                Case Else
                    MessageBox.Show("errore database: " & ex.Message.ToString)
            End Select
        Catch ex As Exception
            MessageBox.Show("Errore Apertura DB DDE : " & ex.Message & vbCrLf & "StackTrace : " & ex.StackTrace, "Errore", MessageBoxButtons.OK, MessageBoxIcon.Error)
        Finally
            riparti.Dispose()
            'dbElezioniDDE.Close()
            'dbElezioniDDE.Dispose()
        End Try
    End Sub

    I'm sorry, but the post was incomplete!
    The problem is when I read the XML data with dtRead.GetOracleBinary:
        Dim blob As OracleBinary = dtRead.GetOracleBinary(4)
        Dim indice As Integer = blob.Length
    blob.Length comes back as 0, while the other fields in the select statement are read correctly.
    Thanks
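    One likely culprit, if this code uses ODP.NET (Oracle.DataAccess): by default the provider fetches 0 bytes of LONG / LONG RAW data per row, so GetOracleBinary returns an empty value. A hedged sketch of the fix; the property is real ODP.NET, but whether it applies depends on which provider the code above actually uses:

    ```vb
    ' ODP.NET fetches 0 bytes of LONG / LONG RAW data per row by default.
    ' Setting InitialLONGFetchSize to -1 asks it to fetch the entire value,
    ' so blob.Length is no longer 0. (Oracle.DataAccess only.)
    riparti = New OracleCommand(strSQL, dbElezioniDDE)
    riparti.InitialLONGFetchSize = -1
    Dim dtRead As OracleDataReader = riparti.ExecuteReader()
    ```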

  • SQLite Inserts Taking Longer and Longer

    The application I'm working on makes repeated calls to a webservice to get data that is then cached in a local sqlite db for the user. Once the db hits ~5mb it starts taking painfully long to run each set of inserts. Calls to the webservice remain quick. There had been an issue with XML not being garbage collected, but I fixed that and now the profiler shows consistent memory usage.
    I've tried running the inserts with indexes on and off. I've tried batching the inserts in transactions of 100, or the entire set. The db calls are synchronous.
    Running the queries against the database directly (not through the AIR application) suggests that there isn't a slowdown at the 5 MB mark there, which is consistent with my experiences with SQLite. Restarting the application and continuing to download data into an existing project does not resolve the issue; it starts off slow.
    So... does anyone have any ideas of other things to try to get insert performance up to a reasonable level? Has anyone else run into similar issues? Is anyone inserting into 20 MB+ dbs and not seeing degrading performance?
    Thanks for the help!

    Guess I posted prematurely. Looking closer, I realized there was a select happening during this process against a text column without an index. The slowdown was just the increasing cost of looping through the entire dataset looking at strings that often shared a fairly sizable starting substring. Chalk another problem up to the importance of appropriate indexes in your db!
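    The effect described above can be demonstrated with a small self-contained Python sqlite3 sketch; the table and column names are made up, not from the AIR app, and the shared prefix mimics the strings in question:

    ```python
    import sqlite3
    import time

    # In-memory DB with many rows sharing a long common prefix.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE cache (id INTEGER PRIMARY KEY, key TEXT)")
    prefix = "com.example.webservice.response."
    conn.executemany("INSERT INTO cache (key) VALUES (?)",
                     [(prefix + str(i),) for i in range(50000)])
    conn.commit()

    def lookup(k):
        return conn.execute("SELECT id FROM cache WHERE key = ?", (k,)).fetchone()

    # Without an index, each equality lookup scans the whole table.
    t0 = time.perf_counter()
    for i in range(0, 50000, 5000):
        lookup(prefix + str(i))
    no_index = time.perf_counter() - t0

    # With an index, the same lookups become B-tree searches.
    conn.execute("CREATE INDEX idx_cache_key ON cache (key)")
    t0 = time.perf_counter()
    for i in range(0, 50000, 5000):
        lookup(prefix + str(i))
    with_index = time.perf_counter() - t0

    print(f"full scan: {no_index:.4f}s, indexed: {with_index:.4f}s")
    ```

    On any recent SQLite the indexed timing is orders of magnitude smaller, which matches the self-answer: the inserts were fine, the interleaved un-indexed selects were the creeping cost.
    
    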

  • Maintenance jobs taking longer and longer

    On 2 of our GroupWise servers the weekly maintenance jobs are taking much longer to run.
    What used to finish before I got in at 7:30am is now still running at 4PM right now on one of the servers. Last monday, they finished by noon.
    We are running GroupWise 8.03 on Netware.
    The two servers have about 300 Gig of data each.
    The issue of maintenance jobs running later started a few months ago, but today is the worst.
    Should I increase the GWWorker threads to something higher? I think I may want to push the startup times earlier too.
    Other ideas?
    thanks
    Phil J


  • Takes longer and longer to move between pages and files... little ball rolls and I can't move

    I got a new hard drive installed this year and it is difficult to move between pages or open or close anything instantly... waiting time varies...

    Let's establish some specifics:
    What model MBA do you have?
    How much RAM
    What HD? Original PATA drive, original SSD design or newer style SSD/RAM drive?
    The file(s) that we're working with, what are they? How large are they and what program(s) are we using in conjunction with them?
    We need to try and figure out if this is an application or an OS X issue.

  • Compressor times getting longer and longer...what is going on?

    I burn two game DVDs per week of about the same length (1 hr to 1 hr 10 min). I have been using the Compressor preset of 90 min best (MPEG-2) and then upping the bit rate to get 3.4 to 3.8 GB per DVD. The times to process are getting ridiculous. Normally it would take 2-3 hrs to process, but in the last week the last 3 Compressor processing times were 5 hrs, 7 hrs, and then today 22 hrs. As far as I know I haven't changed anything. What could be going on and where should I check first? I'm using FCS 2. Thanks!

    In case you haven't already, try some of the things outlined in the Troubleshooting Basics for Compressor.
    For your scenario, I would try the Clear QMaster Cache step first. If that doesn't seem to solve anything, try Repair File Permissions, then Delete Preferences.
    Or are those roads you've travelled already?

  • How to view contents in Long Raw datatype column

    Hi,
    We have two node RAC database with 10.2.0.4.0 version.
    OS - IBM AIX.
    We have a table with a column with datatype "LONG RAW" in production. It stores image files.
    We need to send the images from a few rows to a third-party vendor. Basically, they need to view the images.
    Earlier, I exported to a dump file using Data Pump and sent it to the vendor, but the vendor says they are not able to view the images. Can you please suggest the best method to transfer the images (LONG RAW datatype) and a method to view them?

    We have a table with a column with datatype "LONG RAW" in production. It stores image files. We need to send the images from a few rows to a third-party vendor. Basically, they need to view the images. Earlier, I exported to a dump file using Data Pump and sent it to the vendor, but the vendor says they are not able to view the images. Can you please suggest the best method to transfer the images (LONG RAW datatype) and a method to view them?
    How is the vendor trying to use the extracted images? Data exported with Data Pump must be imported into another database with Data Pump. The same applies to the exp utility (you must use imp to load into a database).
    If you're careful, you should be able to write a binary file using UTL_FILE.
    Regarding the LONG RAW: is there any way you could convert to BLOBs? LONGs and LONG RAWs are notoriously hard to work with.
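    A hedged sketch combining both suggestions: TO_LOB to copy the LONG RAW into a BLOB staging table, then DBMS_LOB plus UTL_FILE to write one image out as a binary file the vendor can open. All object names, the file name, and the directory path below are illustrative.

    ```sql
    -- One-off staging copy: TO_LOB converts LONG RAW to BLOB in a CTAS.
    CREATE TABLE images_blob AS
      SELECT pk, TO_LOB(img_long_raw) AS img
      FROM   images_long;

    CREATE OR REPLACE DIRECTORY img_dir AS '/tmp/images';

    DECLARE
      l_blob BLOB;
      l_file UTL_FILE.FILE_TYPE;
      l_pos  PLS_INTEGER := 1;
      l_amt  PLS_INTEGER := 32767;
      l_buf  RAW(32767);
    BEGIN
      SELECT img INTO l_blob FROM images_blob WHERE pk = 1;
      l_file := UTL_FILE.FOPEN('IMG_DIR', 'image_1.jpg', 'wb', 32767);
      WHILE l_pos <= DBMS_LOB.GETLENGTH(l_blob) LOOP
        DBMS_LOB.READ(l_blob, l_amt, l_pos, l_buf);   -- l_amt is updated to bytes read
        UTL_FILE.PUT_RAW(l_file, l_buf, TRUE);
        l_pos := l_pos + l_amt;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /
    ```

    The resulting files on the server are plain binaries (e.g. JPEGs) that can be sent to the vendor directly, with no import tool needed on their side.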

  • Mapping CLOB and Long in xml schema

    Hi,
    I am creating an xml schema to map some user defined database objects. For example, for a column which is defined as VARCHAR2 in the database, I have the following xsd type mapping.
    <xsd:element name="Currency" type="xsd:string" />
    If the Oracle column is CLOB or LONG (Oracle datatype), could you please tell me how I can map it in the XML schema? I do not want to use an Oracle SQL type like
    xdb:SQLType="CLOB", since I need a generic type mapping for CLOB. Would xsd:string still hold good for CLOB as well as LONG (Oracle datatype)?
    Please help.
    Thanks,
    Vadi.


  • Can't fetch clob and long in one select/query

    I created a nightmare table containing numerous binary data types to test an application I was working on, and believe I have found an undocumented bug in Oracle's JDBC drivers that is preventing me from loading a CLOB and a LONG in a single SQL select statement. I can load the CLOB successfully, but attempting to call ResultSet.get...() for the LONG column always results in
    java.sql.SQLException: Stream has already been closed
    even when processing the columns in the order of the SELECT statement.
    I have demonstrated this behaviour with version 9.2.0.3 of Oracle's JDBC drivers, running against Oracle 9.2.0.2.0.
    The following Java example contains SQL code to create and populate a table containing a collection of nasty binary columns, and then Java code that demonstrates the problem.
    I would really appreciate any workarounds that allow me to pull this data out of a single query.
    import java.sql.*;

    /*
     * This class was developed to verify that you can't have a CLOB and a LONG
     * column in the same SQL select statement and extract both values. Calling
     * get...() for the LONG column always causes
     * 'java.sql.SQLException: Stream has already been closed'.
     *
     * CREATE TABLE BINARY_COLS_TEST (
     *     PK INTEGER PRIMARY KEY NOT NULL,
     *     CLOB_COL CLOB,
     *     BLOB_COL BLOB,
     *     RAW_COL RAW(100),
     *     LONG_COL LONG
     * );
     *
     * INSERT INTO BINARY_COLS_TEST (
     *     PK, CLOB_COL, BLOB_COL, RAW_COL, LONG_COL
     * ) VALUES (
     *     1,
     *     '-- clob value --',
     *     HEXTORAW('01020304050607'),
     *     HEXTORAW('01020304050607'),
     *     '-- long value --'
     * );
     */
    public class JdbcLongTest
    {
        public static void main(String argv[]) throws Exception
        {
            Driver driver = (Driver)Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
            DriverManager.registerDriver(driver);
            Connection connection = DriverManager.getConnection(argv[0], argv[1], argv[2]);
            Statement stmt = connection.createStatement();
            ResultSet results = null;
            try
            {
                String query = "SELECT pk, clob_col, blob_col, raw_col, long_col FROM binary_cols_test";
                results = stmt.executeQuery(query);
                while (results.next())
                {
                    int pk = results.getInt(1);
                    System.out.println("Loaded int");
                    Clob clob = results.getClob(2);
                    // It doesn't work if you just close the ascii stream.
                    // clob.getAsciiStream().close();
                    String clobString = clob.getSubString(1, (int)clob.length());
                    System.out.println("Loaded CLOB");
                    // Streaming not strictly necessary for short values.
                    // Blob blob = results.getBlob(3);
                    byte blobData[] = results.getBytes(3);
                    System.out.println("Loaded BLOB");
                    byte rawData[] = results.getBytes(4);
                    System.out.println("Loaded RAW");
                    byte longData[] = results.getBytes(5);
                    System.out.println("Loaded LONG");
                }
            }
            catch (SQLException e)
            {
                e.printStackTrace();
            }
            finally
            {
                results.close();
                stmt.close();
                connection.close();
            }
        }
    } // public class JdbcLongTest

    The problem is that LONGs are not buffered but are read from the wire in the order defined. The problem is the same as:
        rs = stmt.executeQuery("select myLong, myNumber from tab");
        while (rs.next()) {
            int n = rs.getInt(2);
            String s = rs.getString(1);
        }
    The above will fail for the same reason. When the statement is executed, the LONG is not read immediately. It is buffered in the server waiting to be read. When getInt is called, the driver reads the bytes of the LONG and throws them away so that it can get to the NUMBER and read it. Then, when getString is called, the LONG value is gone, so you get an exception.
    Similar problem here. When the query is executed, the CLOB and BLOB locators are read from the wire, but the LONG is buffered in the server waiting to be read. When Clob.getString is called, it has to talk to the server to get the value of the CLOB, so it reads the LONG bytes from the wire and throws them away. That clears the connection so that it can ask the server for the CLOB bytes. When the code reads the LONG value, those bytes are gone, so you get an exception.
    This is a long-standing restriction on using LONG and LONG RAW values and is a result of the network protocol. It is one of the reasons that Oracle deprecates LONGs and recommends using BLOBs and CLOBs instead.
    Douglas
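    If that analysis is right, a possible workaround is to keep the LONG last in the select list and drain it before materializing the CLOB contents. This is a sketch only, not verified against any particular driver version; it assumes the BINARY_COLS_TEST table from the listing above:

    ```java
    // Sketch: getClob() only reads the locator (no server round trip), so
    // consume the buffered LONG first; after that, the connection is free
    // to make the round trip for the CLOB data.
    String query = "SELECT pk, clob_col, long_col FROM binary_cols_test";
    ResultSet results = stmt.executeQuery(query);
    while (results.next())
    {
        int pk = results.getInt(1);
        Clob clob = results.getClob(2);        // locator only
        byte longData[] = results.getBytes(3); // drain the LONG from the wire first
        String clobString = clob.getSubString(1, (int)clob.length()); // now safe
    }
    ```

    The cleaner fix, as Douglas notes, is simply to migrate the LONG column to a CLOB or BLOB and avoid the ordering constraint entirely.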
