[nQSError: 17012] Bulk fetch failed. (HY000)

Hi All,
Sometimes my report throws the following error message:
ORA-03135. I have attached the query, which results in an error after running for 31 minutes. Below is the error:
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 17001] Oracle Error code: 3135, message: ORA-03135: connection lost contact Process ID: 25523 Session ID: 774 Serial number: 19622 at OCI call OCIStmtFetch. [nQSError: 17012] Bulk fetch failed. (HY000)
Please give me the solution.
Thanks & Regards,
Nantha.

I see the irony was lost, as your reply remained imprecise and uninformative. "Server side everything ok" - what does that even mean? The BI server? The database server? What about the network? What about firewall issues in combination with the expire_time parameter in sqlnet.ora? Are you working with virtual machines on either side (or both)?
http://catb.org/~esr/faqs/smart-questions.html
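For context, the expire_time parameter mentioned above is SQLNET.EXPIRE_TIME in the database server's sqlnet.ora. It enables dead connection detection probes, which as a side effect can keep otherwise idle connections from being dropped by a firewall - one common cause of ORA-03135 on long-running queries. A minimal sketch, with a purely illustrative value that has to be matched to the firewall's idle timeout in your environment:

# sqlnet.ora on the database server (illustrative value only)
SQLNET.EXPIRE_TIME = 10   # probe idle sessions every 10 minutes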

Similar Messages

  • SQL SERVER BULK FETCH AND INSERT/UPDATE?

    Hi All,
       I am currently working with C and SQL Server 2012. My requirement is to bulk fetch records and insert/update them in another table, applying some business logic.
       How do I do this?
       Thanks in advance.
    Regards
    Yogesh.B

    > Is there a possibility that I can do a bulk fetch and place it in an array, even inside a stored procedure?
    You can use temporary tables or table variables, and index them as well.
    > After I have processed my records, tell me a way that I will NOT go RECORD by RECORD, even inside a stored procedure?
    As I said earlier, you can UPDATE these temporary tables or table variables and finally INSERT/UPDATE your base table.
    > Arrays are used just to minimize the traffic between the server and the program area. They are used for efficient processing.
    In your case you would first have to populate the array (using some of your queries from the server), which means you would first load the array, do some updates, and then send the data back to the server - hence the extra network traffic.
    Those are just some thoughts I feel could be useful for your implementation; as we say, there are many ways, so pick the one that works well for you in the long run with good scalability.
    Good luck! Please mark this as answer if it solved your issue. Please vote this as helpful if it helped to solve your issue.

  • NQSerror: 37005 Transactional Update Failed

    Working in the 11g repository in online mode. When attempting to check in changes, I'm getting this error:
    nQSError: 37005 Transactional Update Failed
    If I go to the Fusion Middleware Control and restart all OBIEE services, everything is fine again for a little while, but it eventually decides to stop accepting online changes and produces the same error message.
    Consistency check passes.

    Well, after a month's absence our little friend NQSerror 37005 is back. It is showing up when I'm attempting to check in changes and save the repository (online mode). The only change I made was a permissions change to one of my subject areas.
    Here are the things I've tried to diagnose the problem, with the word SUCCESS if I received no error, or the word FAIL if I got the 37005 error. In all cases, after a 37005 error I've reconnected to the repository in online mode, not allowing the 37005 to get stuck in memory from one test to the next.
    1. Offline: Make a copy of the repository, open in offline mode, change subject area permissions, save: SUCCESS
    2. Online: Add a new fact in the business model, attempt to check in changes and save: SUCCESS
    3. Online: Move the new fact into the presentation layer, check in changes and save: SUCCESS
    4. Online: Change permissions for a subject area and check in changes: FAIL
    5. Online: Change permissions for a presentation layer table and check in changes: SUCCESS
    6. Online: OK, now this is really strange: After doing #5, do #4 again (the one that failed the first time). Now it succeeds. Go figure.
    Since the problem has now gone back into hiding, I'll suspend this post. If the problem appears again, I'll resume the post.
    Note to myself: In the case of a 37005 error, make a mental note of what was changed prior to doing a check-in. Reconnect in online mode and try the exact same thing again. If that error appears again, reconnect in online mode, and try setting permissions for a presentation layer table then checking in changes. If that succeeds, try whatever didn't work before.

  • VLAN RUNNING Config fetch Failed

    Hi;
    I am running CiscoWorks LMS 3.2 with RME 4.3 and SNMPv3 on all devices. Out of 134 devices, 12-13 show "VLAN running config fetch failed". Please also find the dcmaservice.log attached.
    I have already verified the credentials a number of times, and I can access the devices via SSH and Telnet. In CiscoWorks, RME shows the running and startup config fetches as successful while the VLAN running config fetch fails, and on some devices it shows the following:
    Protocol and Platforms passed = TELNET , RMEIOS
    trying for Telnet
    Failed to get CmdSvc from DeviceContext....
    Thanks,
    Best regards;
    Shoaib Ahmed

    Hi Shoaib,
    Try SSH/Telnet into the device manually, issue the command copy flash:vlan.dat tftp:, and
    enter the IP address of the LMS server when prompted. Does it work? If that works fine, try increasing
    the TFTP timeout.
    This can be done under RME > Admin > System Preferences > RME Device Attributes.
    If it fails with an error like the one below:
    %Error opening tftp: ............
    then check whether there is a firewall between these devices and LMS blocking TFTP, as TFTP is the only protocol
    that fetches vlan.dat from the device.
    Thanks--
    Afroj

  • Config Fetch Failed

    The error message below is from a device whose configuration I am trying to sync; the device is using TACACS:
    RUNNING
    CM0151 PRIMARY RUNNING config fetch failed for ROUTER Cause: TELNET: Failed to establish a Telnet connection to
    X.X.X.X Cause connection refused.
    SSH: Same message as above, except authentication failed on the device 3 times.
    ACTION: Check if the protocol is supported and the required device package is installed.

    The error about authentication is usually correct.  Go to Common Services > Device and Credentials > Device Management, select the device in question, then export the credentials.  Look at the CSV export, take the username and password, and try to SSH to the device from the LMS server.  Verify that you can log in and get into enable mode using the same credentials that are in DCR.

  • Bulk fetch with dbms_hs_passthrough

    Can we use bulk fetch with dbms_hs_passthrough?
    Currently I use code like this:
    c := dbms_hs_passthrough.open_cursor@<dblink>;
    dbms_hs_passthrough.parse@<dblink>(c,
      'select
         RecordId
       , some_VERY_long_column_from_mssoft_sql_server as shortcol
       from table');
    LOOP
      nr := dbms_hs_passthrough.fetch_row@<dblink>(c);
      EXIT WHEN nr = 0;
      dbms_hs_passthrough.get_value@<dblink>(c, 1, l_obj_type.RECORDID);
    I am on 9.2 and would like to use bulk fetch; the source table consists of more than a million records.

    Hi,
    DBMS_HS_PASSTHROUGH has no option to do a bulk fetch. However, result sets are supported for some types of gateways. If you are using Generic Connectivity (HSODBC), result sets are not supported.
    Regards,
    Ed
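    Since DBMS_HS_PASSTHROUGH.FETCH_ROW only returns one row per call, the closest workaround is to buffer the fetched values into a local PL/SQL collection and process them in batches. The sketch below only illustrates that idea; the db link is written <dblink> as in the question, the remote table and column names are invented, and the remote fetch itself remains row by row:

    DECLARE
      c     PLS_INTEGER;
      nr    PLS_INTEGER;
      v_id  VARCHAR2(100);
      TYPE t_ids IS TABLE OF VARCHAR2(100);
      l_ids t_ids := t_ids();
    BEGIN
      c := dbms_hs_passthrough.open_cursor@<dblink>;
      dbms_hs_passthrough.parse@<dblink>(c, 'select RecordId from remote_table');
      LOOP
        nr := dbms_hs_passthrough.fetch_row@<dblink>(c);   -- one remote row per call
        EXIT WHEN nr = 0;
        dbms_hs_passthrough.get_value@<dblink>(c, 1, v_id);
        l_ids.EXTEND;
        l_ids(l_ids.LAST) := v_id;                          -- buffer locally
        IF l_ids.COUNT >= 1000 THEN
          -- process the batch here (e.g. a FORALL insert), then reset the buffer
          l_ids := t_ids();
        END IF;
      END LOOP;
      dbms_hs_passthrough.close_cursor@<dblink>(c);
      -- process any remaining buffered rows here
    END;
    /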

  • Which method does the actual bulk fetch from database in ADF?

    Hi,
    I'm looking to instrument my ADF code to see where bottlenecks are. Does anyone know which method does the bulk fetch from the database so that I can override it?
    Thanks
    Kevin

    Hi,
    I think you need to be more specific. ADF is a metadata binding layer that delegates data queries to the
    business service.
    Frank
    Sorry - to be specific, I probably mean BC4J - when a query runs in a view object.

  • CommandFailedException : A350 NO FETCH failed

    javaMail 1.4.4
    demo msgshow.java using IMAP protocol
    An exception occurs while reading an "Undeliverable" message generated by Microsoft Outlook 14.0.
    This is the message envelope
    Oops, got exception! A350 NO FETCH failed. Mail cannot be parsed.
    javax.mail.MessagingException: A350 NO FETCH failed. Mail cannot be parsed.;
    nested exception is:
        com.sun.mail.iap.CommandFailedException: A350 NO FETCH failed. Mail cannot be parsed.
    MESSAGE #4:
    This is the message envelope
        at com.sun.mail.imap.IMAPMessage.loadEnvelope(IMAPMessage.java:1237)
        at com.sun.mail.imap.IMAPMessage.getSubject(IMAPMessage.java:335)
        at msgshow.dumpEnvelope(msgshow.java:518)
        at msgshow.dumpPart(msgshow.java:312)
        at msgshow.main(msgshow.java:263)
    Caused by: com.sun.mail.iap.CommandFailedException: A350 NO FETCH failed. Mail cannot be parsed.
        at com.sun.mail.iap.Protocol.handleResult(Protocol.java:344)
        at com.sun.mail.imap.IMAPMessage.loadEnvelope(IMAPMessage.java:1232)
        ... 4 more
    Is JavaMail able to parse this kind of message?

    IMAP messages are normally parsed on the server. It's your server that's complaining that it can't parse the message.
    Please report the bug to your server vendor.
    The JavaMail FAQ has tips for how to work around such server bugs. Essentially, it involves downloading the message
    to the client and using JavaMail to parse it.

  • XML fetch failed -- possibly a problem with AvXml.dll or its permissions

    When attempting to access the Status Monitor I receive the following error message:
    XML fetch failed -- possibly a problem with AvXml.dll or its permissions.
    Check that the "AvXml" virtual directory settings in IIS have proper permissions and allow execute access.
    I've tried all combos of security on the IIS directory. What am I missing?

    Hi Lindborg,
    I have that problem with version 4.0(x). What is it due to, and how do I fix it?
    My Unity is a failover cluster and I have several problems: sometimes it does not let me delete a mailbox or retrieve messages.
    Thanks for any help you can provide.
    Regards

  • Problem with bulk fetch

    Hi,
    I am using 9i and I have to bulk fetch into a table type of some object type.
    My object is:
    CREATE TYPE PART_SRN AS OBJECT (
      PART_NO VARCHAR2(15),
      PRN_SRN VARCHAR2(20)
    );
    and in the PL/SQL block I have declared the table type, i.e.:
    TYPE REC_P IS TABLE OF PART_SRN;
    REC REC_P;
    Now I need to bulk fetch the query result into REC:
    SELECT X, Y BULK COLLECT INTO ?
    FROM ABC;
    Can anybody tell me how to do it?
    Thanks in advance
    Piyush

    Like this?:
    create or replace type emp_typ as object (
       empno      number (4),
       ename      varchar2 (10),
       job        varchar2 (9),
       mgr        number (4),
       hiredate   date,
       sal        number (7, 2),
       comm       number (7, 2),
       deptno     number (2)
    );
    /
    declare
       type rec_p is table of emp_typ;
       rec   rec_p;
    begin
       select emp_typ (empno, ename, job, mgr, hiredate, sal, comm, deptno)
       bulk collect into rec
         from emp;
    end;
    /
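    Mapped back to the PART_SRN type and the ABC table from the question, the same pattern would presumably look like the sketch below (the column names X and Y are taken from the question's SELECT and may need adjusting to the real columns):

    declare
       type rec_p is table of part_srn;
       rec   rec_p;
    begin
       select part_srn (x, y)
       bulk collect into rec
         from abc;
    end;
    /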

  • Bulk Fetch stored procedure.

    I am new to the Oracle world.
    Does anyone have a good but simple example of a bulk fetch that shows the creation of the container variable?

    declare
     /* Declare index-by table of records type */
     type emp_rec_tab is table of emp%rowtype index by binary_integer;

     /* Declare table variable */
     emptab emp_rec_tab;

     /* Declare REF CURSOR variable using the SYS_REFCURSOR declaration
        available in 9i and above */
     rcur sys_refcursor;

     /* Declare an ordinary cursor */
     cursor ocur is select * from emp;

    begin

     /* bulk fetch using implicit cursor */
     select * bulk collect into emptab from emp;
     dbms_output.put_line( SQL%ROWCOUNT || ' rows fetched at once from implicit cursor');
     dbms_output.put_line('---------------------------------------------');

     /* bulk fetch from an ordinary cursor */
     open ocur;
     fetch ocur bulk collect into emptab;
     dbms_output.put_line( ocur%ROWCOUNT || ' rows fetched at once from ordinary cursor');
     dbms_output.put_line('---------------------------------------------');
     close ocur;

     /* bulk fetch from an ordinary cursor using the LIMIT clause */
     open ocur;
     loop
      fetch ocur bulk collect into emptab limit 4;
      dbms_output.put_line(
        emptab.count ||
        ' rows fetched at one iteration from ordinary cursor using limit');
      exit when ocur%notfound;
     end loop;
     close ocur;
     dbms_output.put_line('---------------------------------------------');

     /* bulk fetch from ref cursor */
     open rcur for select * from emp;
     fetch rcur bulk collect into emptab;
     dbms_output.put_line( rcur%ROWCOUNT || ' rows fetched at once from ref cursor');
     dbms_output.put_line('---------------------------------------------');
     close rcur;

     /* bulk fetch from ref cursor using the LIMIT clause */
     open rcur for select * from emp;
     loop
      fetch rcur bulk collect into emptab limit 4;
      dbms_output.put_line( emptab.count ||
      ' rows fetched at one iteration from ref cursor using limit');
      exit when rcur%notfound;
     end loop;
     close rcur;
     dbms_output.put_line('---------------------------------------------');

     /* bulk fetch using execute immediate */
     execute immediate 'select * from emp' bulk collect into emptab;
     dbms_output.put_line( SQL%ROWCOUNT || ' rows fetched using execute immediate');
     dbms_output.put_line('---------------------------------------------');

    end;
    /
    14 rows fetched at once from implicit cursor
    14 rows fetched at once from ordinary cursor
    4 rows fetched at one iteration from ordinary cursor using limit
    4 rows fetched at one iteration from ordinary cursor using limit
    4 rows fetched at one iteration from ordinary cursor using limit
    2 rows fetched at one iteration from ordinary cursor using limit
    14 rows fetched at once from ref cursor
    4 rows fetched at one iteration from ref cursor using limit
    4 rows fetched at one iteration from ref cursor using limit
    4 rows fetched at one iteration from ref cursor using limit
    2 rows fetched at one iteration from ref cursor using limit
    14 rows fetched using execute immediate

    PL/SQL procedure successfully completed.

    Rgds.

  • Bulk Fetch Exception Handling

    How do you use exception in BULK FETCH?

    <How do you use exception in BULK FETCH?>
    I've never gotten an exception in a bulk fetch, aside from the error you get when the query's select list does not match the variables it is being selected into. I think that normal exception handling would apply.
    I'm assuming you mean BULK COLLECT. Or did you mean bulk binds, i.e. the FORALL statement?
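    If the question is about bulk binds, the usual pattern is FORALL ... SAVE EXCEPTIONS, which collects per-row errors in SQL%BULK_EXCEPTIONS instead of stopping at the first failure. Below is only a sketch; the collection contents and the target_table name are made up:

    DECLARE
      TYPE t_ids IS TABLE OF NUMBER;
      l_ids       t_ids := t_ids(1, 2, 3);        -- made-up sample data
      bulk_errors EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errors, -24381); -- ORA-24381: error(s) in array DML
    BEGIN
      FORALL i IN 1 .. l_ids.COUNT SAVE EXCEPTIONS
        INSERT INTO target_table (id) VALUES (l_ids(i));
    EXCEPTION
      WHEN bulk_errors THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          dbms_output.put_line('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                               ' failed with ORA-' || SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
        END LOOP;
    END;
    /
    A plain BULK COLLECT fetch, on the other hand, raises ordinary exceptions (for example when the select list does not match the target collections), so a normal exception handler is enough there.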

  • Bulk fetch taking long time.

    I have created a procedure in which I am fetching data from a remote DB using a database link.
    I have 1 million rows to fetch and the row size is around 200 bytes (the table has 10 attributes of about 20 bytes each).
    OPEN cur_noit;
    FETCH cur_noit BULK COLLECT INTO rec_get_cur_noit;
    CLOSE cur_noit;
    The problem is that it is taking more than 4 hours just to fetch the data. I need to know the contributing factors, how to check them, and most importantly what can be done, for example:
    1. Is the DB link slow? How can I check the speed of the DB link?
    2. I am fetching a large volume, so is my PGA full or not being used in an optimized way? How can I check the size of the PGA, increase it, and set an optimum value?
    My CPU usage seems fine.
    Please let me know what else the reasons could be.
    *I know I can use the LIMIT clause with the bulk fetch. Kindly let me know if that could also be a reason for the above problem.

    A couple more things: I am using Oracle 9i.
    1. I also need to transform the data (multiplying a column value by a fixed integer, or setting a variable from another string; the local table has a couple more attributes for which I need to fetch values from another table), so it will not be an exact replication.
    2. I will not take all the rows from the remote DB; I have a WHERE clause by which I select the subset of what I want to copy.
    Do you think it is achievable by the methods below?
    Apologies, I am a novice in this and just googled a bit about the method you suggested, so please ignore my noviceness.
    Materialized views:
    - It is going to make a local copy of the whole table, thereby taking space on my current DB.
    - If I make a materialized view just before starting the copy, what difference would it make? I am again first copying the data from the remote DB and then fetching from the materialized view. I am not sure we aren't doing more processing now, i.e. using the network to build the materialized view plus fetching from that cursor, thereby using the same memory as before.
    - There is always the possibility of a delay in refresh, i.e. between when tuples are changed in the remote DB and when I copy them into my actual table from the materialized view.
    Merge:
    I am using BULK COLLECT and a bulk-bound FORALL insert into my local table. Do you think this method would be faster and could solve the problem? I have explained above what I am intending to do.
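    For reference, the BULK COLLECT ... LIMIT plus FORALL combination described above usually looks something like the sketch below. The remote table, database link, local table and the simple transformation are invented names for illustration; on 9i the LIMIT clause mainly keeps PGA usage bounded, while the speed of the pull itself is still governed by the network and the remote query:

    DECLARE
      CURSOR cur_noit IS
        SELECT id, amount
          FROM remote_table@remote_db           -- hypothetical remote table and db link
         WHERE amount > 0;                      -- the subset mentioned in the post
      TYPE t_ids     IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
      TYPE t_amounts IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
      l_ids     t_ids;
      l_amounts t_amounts;
    BEGIN
      OPEN cur_noit;
      LOOP
        FETCH cur_noit BULK COLLECT INTO l_ids, l_amounts LIMIT 5000;
        EXIT WHEN l_ids.COUNT = 0;
        FOR i IN 1 .. l_amounts.COUNT LOOP
          l_amounts(i) := l_amounts(i) * 10;    -- the "multiply by a fixed integer" transformation
        END LOOP;
        FORALL i IN 1 .. l_ids.COUNT
          INSERT INTO local_table (id, amount) VALUES (l_ids(i), l_amounts(i));
        COMMIT;                                 -- batch-wise commit; adjust to your transaction rules
      END LOOP;
      CLOSE cur_noit;
    END;
    /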

  • Bulk Fetch from a Cursor

    Hi all,
         Can you please give your comments on the code below.
     We are facing a situation where the value of <cursor_name>%NOTFOUND is misleading. How we are overcoming the issue is by moving the 'exit when cur_name%notfound' statement to just before the end loop.
    open l_my_cur;
    loop
      fetch l_my_cur bulk collect
        into l_details_array;
      --<< control comes here >>
      --<< l_details_array.count gives me the correct no of rows >>
      exit when l_my_cur%NOTFOUND;
      --<< control never reaches here >>
      --<< %notfound is true >>
      --<< %notfound is false only when there are as many records fetched as the limit (if set) >>
      forall i in 1 .. l_count
        insert into my_table ....( .... ) values ( .... l_details_array(i) ...);
      --<< This is never executed :-( >>
    end loop;
    Thanks,
    Sunil.

    Read
      fetch l_my_cur bulk collect
        into l_details_array;
    as
      fetch l_my_cur bulk collect
        into l_details_array limit 10000;
    I am trying to process 10,000 rows at a time from a possible 100,000 records.
    Sunil.
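    For what it's worth, the behaviour described in the original post is the documented interaction of %NOTFOUND with BULK COLLECT ... LIMIT: %NOTFOUND becomes TRUE as soon as a fetch returns fewer rows than the limit, even though that last partial batch still needs processing. A common alternative to moving the exit to the bottom of the loop, sketched below with made-up table and column names, is to exit when the collection comes back empty:

    DECLARE
      CURSOR l_my_cur IS SELECT id, name FROM source_table;  -- hypothetical source
      TYPE t_ids   IS TABLE OF NUMBER        INDEX BY BINARY_INTEGER;
      TYPE t_names IS TABLE OF VARCHAR2(100) INDEX BY BINARY_INTEGER;
      l_ids   t_ids;
      l_names t_names;
    BEGIN
      OPEN l_my_cur;
      LOOP
        FETCH l_my_cur BULK COLLECT INTO l_ids, l_names LIMIT 10000;
        EXIT WHEN l_ids.COUNT = 0;   -- exit on an empty fetch, not on %NOTFOUND
        FORALL i IN 1 .. l_ids.COUNT
          INSERT INTO my_table (id, name) VALUES (l_ids(i), l_names(i));
      END LOOP;
      CLOSE l_my_cur;
    END;
    /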

  • SES web crawler not able to connect to a particular URL, fetching fails

    Using SES 11.1.2.0 on Linux 64-bit, with a crawling depth of 5.
    A web crawler has been created on "http://www.advancedinnovationsinc.com".
    Document fetching fails at depth 2.
    I have enabled DEBUG logging, from which I found that all URLs at depth 2 or deeper get an "HTTP/1.1 400 Bad Request" error.
    So is this an SES product issue or a usage problem?
    Below is a snapshot of the generated log.
    01:43:16:054 INFO filter_0 urlString:http://www.advancedinnovationsinc.com/../index.html
    01:43:16:054 INFO filter_0 hostname :www.advancedinnovationsinc.com
    01:43:16:054 INFO filter_0 filepath :/../index.html
    01:43:16:054 INFO filter_0 Port :-1
    01:43:16:054 INFO filter_0 useSSL :false
    01:43:16:054 INFO filter_0 useProxy :true
    01:43:16:054 INFO filter_0 ==== DEBUG==== ------ New readWebURL ENDS ----------
    01:43:16:304 INFO filter_0 ==== DEBUG==== m_header:[Ljava.lang.String;@122b7db1
    01:43:16:304 INFO filter_0 ==== DEBUG==== statusLine:HTTP/1.1 400 Bad Request
    01:43:16:304 INFO filter_0 ==== DEBUG==== start:9
    01:43:16:304 INFO filter_0 ==== DEBUG==== statuscode:400
    01:43:16:304 INFO filter_0 ==== DEBUG==== URLAcess.java:2232
    01:43:16:321 INFO filter_0 EQG-30009: http://www.advancedinnovationsinc.com/../index.html: Bad request
    01:43:16:321 INFO filter_0 Documents to process = 7
    01:43:16:329 INFO filter_0 ==== DEBUG==== m_currentURL:http://www.advancedinnovationsinc.com/../create.html
    01:43:16:329 INFO filter_0 ==== DEBUG==== m_urlString:http://www.advancedinnovationsinc.com/../create.html
    01:43:16:329 DEBUG filter_0 Processing http://www.advancedinnovationsinc.com/../create.html
    01:43:16:329 INFO filter_0 ==== DEBUG==== ------ New readWebURL STARTS ----------
    01:43:16:329 INFO filter_0 urlString:http://www.advancedinnovationsinc.com/../create.html
    01:43:16:329 INFO filter_0 hostname :www.advancedinnovationsinc.com
    01:43:16:330 INFO filter_0 filepath :/../create.html
    01:43:16:330 INFO filter_0 Port :-1
    01:43:16:330 INFO filter_0 useSSL :false
    01:43:16:330 INFO filter_0 useProxy :true
    01:43:16:330 INFO filter_0 ==== DEBUG==== ------ New readWebURL ENDS ----------
    01:43:16:736 INFO filter_0 ==== DEBUG==== m_header:[Ljava.lang.String;@122b7db1
    01:43:16:736 INFO filter_0 ==== DEBUG==== statusLine:HTTP/1.1 400 Bad Request
    01:43:16:736 INFO filter_0 ==== DEBUG==== start:9
    01:43:16:736 INFO filter_0 ==== DEBUG==== statuscode:400
    01:43:16:736 INFO filter_0 ==== DEBUG==== URLAcess.java:2232
    01:43:16:738 INFO filter_0 EQG-30009: http://www.advancedinnovationsinc.com/../create.html: Bad request

