Best way to Fetch the record

Hi,
Please suggest the best way to fetch records from the table described below. It is Oracle 10gR2 on Linux.
Whenever a client visits the office, a record is created for him. Company policy is to retain 10 years of data in the transaction table, and the table grows by about 3 million records per year.
The table has the following key columns for the SELECT (sample table):
Client_Visit
ID NUMBER(12,0) -- sequence-generated number
EFF_DTE DATE -- effective date of the customer (sometimes the client becomes invalid and later becomes valid again)
CREATE_TS TIMESTAMP(6)
CLIENT_ID NUMBER(9,0)
CASCADE_FLG VARCHAR2(1)
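For concreteness, the description above corresponds roughly to this DDL. This is only a sketch reconstructed from the post; the table name, NOT NULL constraints, and anything not listed above are assumptions:

```sql
-- Sketch of the sample table as described in the post (precisions taken
-- from the column list above; constraints are assumed, not stated).
CREATE TABLE client_visit (
  id          NUMBER(12,0)  NOT NULL,  -- sequence-generated number
  eff_dte     DATE          NOT NULL,  -- effective date of the customer
  create_ts   TIMESTAMP(6)  NOT NULL,
  client_id   NUMBER(9,0)   NOT NULL,
  cascade_flg VARCHAR2(1)              -- 'Y'/'N' flag used by the reports
);
```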
On most reports the records are fetched by MAX(eff_dte) and MAX(create_ts) with cascade flag = 'Y'.
I have the following two queries, but neither is cost-effective; each takes about 8 minutes to return the records.
Code 1:
SELECT au_subtyp1.au_id_k,
       au_subtyp1.pgm_struct_id_k
  FROM au_subtyp au_subtyp1
 WHERE au_subtyp1.create_ts =
          (SELECT MAX (au_subtyp2.create_ts)
             FROM au_subtyp au_subtyp2
            WHERE au_subtyp2.au_id_k = au_subtyp1.au_id_k
              AND au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
              AND au_subtyp2.eff_dte =
                     (SELECT MAX (au_subtyp3.eff_dte)
                        FROM au_subtyp au_subtyp3
                       WHERE au_subtyp3.au_id_k = au_subtyp2.au_id_k
                         AND au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                         AND au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
   AND au_subtyp1.exists_flg = 'Y'
Explain Plan
Plan hash value: 2534321861
| Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
|   0 | SELECT STATEMENT         |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
|*  1 |  FILTER                  |           |       |       |       |            |          |
|   2 |   HASH GROUP BY          |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
|*  3 |    HASH JOIN             |           |  1404K|   121M|    19M| 33178   (1)| 00:06:39 |
|*  4 |     HASH JOIN            |           |   307K|    16M|  8712K| 23708   (1)| 00:04:45 |
|   5 |      VIEW                | VW_SQ_1   |   307K|  5104K|       | 13493   (1)| 00:02:42 |
|   6 |       HASH GROUP BY      |           |   307K|    13M|   191M| 13493   (1)| 00:02:42 |
|*  7 |        INDEX FULL SCAN   | AUSU_PK   |  2809K|   125M|       | 13493   (1)| 00:02:42 |
|*  8 |      INDEX FAST FULL SCAN| AUSU_PK   |  2809K|   104M|       |  2977   (2)| 00:00:36 |
|*  9 |     TABLE ACCESS FULL    | AU_SUBTYP |  1404K|    46M|       |  5336   (2)| 00:01:05 |
Predicate Information (identified by operation id):
   1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
   3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
   4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
   7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
              hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
       filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
              "AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
   8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
   9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')
Code 2:
I raised a thread a week back and Dom suggested the following query. Its cost estimate is better, but the elapsed time is the same and it uses the same amount of temp tablespace.
SELECT au_id_k, pgm_struct_id_k
FROM (
  SELECT au_id_k,
         pgm_struct_id_k,
         ROW_NUMBER() OVER (PARTITION BY au_id_k
                            ORDER BY eff_dte DESC, create_ts DESC) rn,
         create_ts, eff_dte, exists_flg
  FROM   au_subtyp
  WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
  AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')
) d
WHERE rn = 1
AND   exists_flg = 'Y'
--Explain Plan
Plan hash value: 4039566059
| Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
|   0 | SELECT STATEMENT         |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
|*  1 |  VIEW                    |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
|*  2 |   WINDOW SORT PUSHED RANK|           |  2809K|   133M|   365M| 40034   (1)| 00:08:01 |
|*  3 |    TABLE ACCESS FULL     | AU_SUBTYP |  2809K|   133M|       |  5345   (2)| 00:01:05 |
Predicate Information (identified by operation id):
   1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
   2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
              INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
   3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
               2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
Thanks,
Vijay
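One more variation worth benchmarking for this latest-row-per-key requirement is Oracle's KEEP (DENSE_RANK LAST) aggregate, which lets the optimizer use a GROUP BY instead of a full window sort. This is only a sketch against the same au_subtyp columns, not a tested rewrite, and note it treats ties on (eff_dte, create_ts) differently from ROW_NUMBER (it aggregates over all tied rows rather than picking one arbitrarily):

```sql
SELECT au_id_k, pgm_struct_id_k
FROM (
  SELECT au_id_k,
         -- For each au_id_k, take the value from the row with the
         -- greatest (eff_dte, create_ts) inside the date window.
         MAX(pgm_struct_id_k) KEEP (DENSE_RANK LAST
                                    ORDER BY eff_dte, create_ts) AS pgm_struct_id_k,
         MAX(exists_flg)      KEEP (DENSE_RANK LAST
                                    ORDER BY eff_dte, create_ts) AS exists_flg
  FROM   au_subtyp
  WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
  AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')
  GROUP BY au_id_k
)
WHERE exists_flg = 'Y';
```

Whether this beats the analytic version depends on the data and PGA settings, so compare the actual plans and workarea statistics on the test system before drawing conclusions.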

Hi Justin,
Thanks for your reply. I am running this on our test environment, as I don't want to run it on production right now. The test environment holds 2,809,605 records (about 2.8 million).
The query returns 281,699 rows (about 282 thousand), so the selectivity is roughly 0.1. There are 2,808,905 distinct combinations of (create_ts, eff_dte, exists_flg). I agree the index scan is not going to help much, as you said.
The core problem is that both queries use a lot of temp tablespace. When we join this query to other tables, which have the same design as below, the temp usage grows even bigger.
Both the production and test environment are 3 Node RAC.
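Since the WINDOW SORT is what spills to temp (365M estimated in the Code 2 plan), one small thing to try is narrowing the rows being sorted: eff_dte and create_ts are only needed in the ORDER BY, not in the inline view's select list. A sketch of Code 2 with a narrower sort row follows; whether the saving is enough to keep the workarea optimal depends on the PGA configuration, so this is untested speculation, not a verified fix:

```sql
-- Same logic as Code 2, but the window sort carries only the columns
-- the outer query actually needs, which shrinks each sorted row.
SELECT au_id_k, pgm_struct_id_k
FROM (
  SELECT au_id_k,
         pgm_struct_id_k,
         exists_flg,
         ROW_NUMBER() OVER (PARTITION BY au_id_k
                            ORDER BY eff_dte DESC, create_ts DESC) rn
  FROM   au_subtyp
  WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
  AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')
) d
WHERE rn = 1
AND   exists_flg = 'Y';
```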
First Query...
CPU used by this session     4740
CPU used when call started     4740
Cached Commit SCN referenced     21393
DB time     4745
OS Involuntary context switches     467
OS Page reclaims     64253
OS System time used     26
OS User time used     4562
OS Voluntary context switches     16
SQL*Net roundtrips to/from client     9
bytes received via SQL*Net from client     2487
bytes sent via SQL*Net to client     15830
calls to get snapshot scn: kcmgss     37
consistent gets     52162
consistent gets - examination     2
consistent gets from cache     52162
enqueue releases     19
enqueue requests     19
enqueue waits     1
execute count     2
ges messages sent     1
global enqueue gets sync     19
global enqueue releases     19
index fast full scans (full)     1
index scans kdiixs1     1
no work - consistent read gets     52125
opened cursors cumulative     2
parse count (hard)     1
parse count (total)     2
parse time cpu     1
parse time elapsed     1
physical write IO requests     69
physical write bytes     17522688
physical write total IO requests     69
physical write total bytes     17522688
physical write total multi block requests     69
physical writes     2139
physical writes direct     2139
physical writes direct temporary tablespace     2139
physical writes non checkpoint     2139
recursive calls     19
recursive cpu usage     1
session cursor cache hits     1
session logical reads     52162
sorts (memory)     2
sorts (rows)     760
table scan blocks gotten     23856
table scan rows gotten     2809607
table scans (short tables)     1
user I/O wait time     1
user calls     11
workarea executions - onepass     1
workarea executions - optimal     9
Second Query
CPU used by this session     1197
CPU used when call started     1197
Cached Commit SCN referenced     21393
DB time     1201
OS Involuntary context switches     8684
OS Page reclaims     21769
OS System time used     14
OS User time used     1183
OS Voluntary context switches     50
SQL*Net roundtrips to/from client     9
bytes received via SQL*Net from client     767
bytes sent via SQL*Net to client     15745
calls to get snapshot scn: kcmgss     17
consistent gets     23871
consistent gets from cache     23871
db block gets     16
db block gets from cache     16
enqueue releases     25
enqueue requests     25
enqueue waits     1
execute count     2
free buffer requested     1
ges messages sent     1
global enqueue get time     1
global enqueue gets sync     25
global enqueue releases     25
no work - consistent read gets     23856
opened cursors cumulative     2
parse count (hard)     1
parse count (total)     2
parse time elapsed     1
physical read IO requests     27
physical read bytes     6635520
physical read total IO requests     27
physical read total bytes     6635520
physical read total multi block requests     27
physical reads     810
physical reads direct     810
physical reads direct temporary tablespace     810
physical write IO requests     117
physical write bytes     24584192
physical write total IO requests     117
physical write total bytes     24584192
physical write total multi block requests     117
physical writes     3001
physical writes direct     3001
physical writes direct temporary tablespace     3001
physical writes non checkpoint     3001
recursive calls     25
session cursor cache hits     1
session logical reads     23887
sorts (disk)     1
sorts (memory)     2
sorts (rows)     2810365
table scan blocks gotten     23856
table scan rows gotten     2809607
table scans (short tables)     1
user I/O wait time     2
user calls     11
workarea executions - onepass     1
workarea executions - optimal     5
Thanks,
Vijay
Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:17 AM
Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:19 AM

Similar Messages

  • What is the best way to do voice recording in a Macbook Pro?

    What is the best way to do voice recording on a MacBook Pro? I want to voice record and send the result as an MP3 file.
    Thanks

    Deleting the application from your /Applications folder is sufficient. There are sample projects in /Library/Application/Aperture you may want to get rid of as well, as they take up a fair bit of space.

  • What is the best way to get the last record from an internal table?

    Hi,
    what is the best way to get the latest year and month?
    I want the last record (KD00011001H 1110 2007 11),
    not KE00012002H or KA00012003H.
    Is there any function for the MBEWH table?
    MATNR                 BWKEY      LFGJA LFMON
    ========================================
    KE00012002H        1210             2005  12
    KE00012002H        1210             2006  12
    KA00012003H        1000             2006  12
    KD00011001H        1110             2005  12
    KD00011001H        1110             2006  12
    KD00011001H        1110             2007  05
    KD00011001H        1110             2007  08
    KD00011001H        1110             2007  09
    KD00011001H        1110             2007  10
    KD00011001H        1110             2007  11
    thank you
    dennis
    Edited by: ogawa Dennis on Jan 2, 2008 1:28 AM
    Edited by: ogawa Dennis on Jan 2, 2008 1:33 AM

    Hi dennis,
    you can try this:
    Sort <your internal_table MBEWH> BY lfgja DESCENDING lfmon DESCENDING.
    Thanks
    William Wilstroth
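    If the data can be read with database SQL rather than sorted in ABAP, the latest-row-per-key pattern from the main thread applies here too. A sketch only, assuming a database that supports analytic functions and using the MBEWH column names from the question:

    ```sql
    -- Latest (lfgja, lfmon) row per material: rank each material's rows
    -- by year and month descending, then keep the first.
    SELECT matnr, bwkey, lfgja, lfmon
    FROM (
      SELECT matnr, bwkey, lfgja, lfmon,
             ROW_NUMBER() OVER (PARTITION BY matnr
                                ORDER BY lfgja DESC, lfmon DESC) rn
      FROM   mbewh
    )
    WHERE rn = 1;
    ```

    For the sample data above this would return KD00011001H 1110 2007 11 for that material, alongside the latest row for each of the other materials.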

  • How to do query optimization? It takes too long to fetch records from the DB. Very urgent, I need your assistance

    Hi all
    I want to fetch just twenty thousand records from a table, but my query takes too long to fetch them. I have posted my working query below; could you correct it for me? Thanks in advance.
    Query                    
    SELECT b.concatenated_account account,
           b.account_description description,
           SUM(CASE WHEN bl.actual_flag = 'B'
                    THEN (NVL(bl.period_net_dr, 0) - NVL(bl.period_net_cr, 0))
                       + (NVL(bl.project_to_date_dr, 0) - NVL(bl.project_to_date_cr, 0))
               END) "Budget_2011"
    FROM   gl_balances bl,
           gl_code_combinations gcc,
           psb_ws_line_balances_i b,
           gl_budget_versions bv,
           gl_budgets_v gv
    WHERE  b.code_combination_id  = gcc.code_combination_id
    AND    bl.code_combination_id = gcc.code_combination_id
    AND    bl.budget_version_id   = bv.budget_version_id
    AND    gv.budget_version_id   = bv.budget_version_id
    AND    gv.latest_opened_year IN
             (SELECT latest_opened_year - 3
              FROM   gl_budgets_v
              WHERE  latest_opened_year = :BUDGET_YEAR)
    GROUP BY b.concatenated_account, b.account_description

    Hi,
    If this question is related to SQL then please post in SQL forum.
    Otherwise provide more information how this sql is being used and do you want to tune the SQL or the way it fetches the information from DB and display in OAF.
    Regards,
    Sandeep M.

  • Best Way to port the data from one DB to another DB using Biztalk

    Hi,
    please suggest best way to move the data from one db to another DB using biztalk.
    Currently I am doing it like this: for each transaction (coming from different source tables) through a receive port, I do some mapping (custom logic for data mapping), insert into the target normalized tables (multiple tables), and then go back and update the status
    of the transaction in the source table in the source DB. It processes records one by one.
    What is the best way to do this with a bulk transfer and status update? There are more than 10,000 transactions per call.
    Thanks,
    Vinoth

    Hi Vinoth,
    For SQL Bulk inserts you can always use SQL Bulk Load
    adapter.
    http://www.biztalkgurus.com/biztalk_server/biztalk_blogs/b/biztalksyn/archive/2005/10/23/processing-a-large-flat-file-message-with-biztalk-and-the-sqlbulkinsert-adapter.aspx
    However, even though the SQL Bulk Load adapter can efficiently insert a large amount of data into SQL Server, you are still stuck with the overhead of moving messages through the
    MessageBox database and the memory issues of dealing with really large messages.
    I would personally suggest you use SSIS, as you have mentioned that records have to be processed at a specific time of day, as opposed to whenever the
    records are available.
    Please refer to this link to get more information about SSIS: http://msdn.microsoft.com/en-us/library/ms141026.aspx
    If you have any more questions related to SSIS, please ask it in
    SSIS 
    forum and you will get specific support.
    Rachit

  • Best way to show the following data?

    Hi Experts,
    I am designing a validation solution in Java Web Dynpro.
    I need to process 1000 records against certain logic and show the errors in the interface. Each record has more than 30 columns. I would like your suggestions on:
    1. How to display 1000 records with nearly 30 columns. Is a normal table with scrolling enough, or should I split them across 2 or 3 tabs based on some grouping?
    2. What would be the best way to show the error messages on these records? Is it advisable to show them in a tooltip? Is it possible to show a tooltip for each record in a table?
    Any suggestions would be really helpful for my work...
    Thanks in advance..
    Regards,
    Senthil
    Edited by: Senthilkumar.Paulchamy on Jun 23, 2011 9:18 PM

    Hi,
    Depending on your data would use a TableColumnGroup to group your 30 columns under a common header (displaying only the most significant data)
    Expanding the row would then show all 30 columns' data
    To display the errors, I would display the common header rows in bold red to make them stand out from the non-erroneous rows.
    Furthermore, I would add a toggle button, to alternate between 'all records' and 'erroneous records only'
    Best,
    Robin

  • Best Way To merge Customer Records

    Hi community,
    What is the best way to merge customer records for the same person who may have used different email IDs to correspond with the BC website?

    Not in BC, no. You would need to export a custom report, sort it in Excel, shape it into the customer import format and import it back in, or create an API integration that goes through and cleans the records up.

  • I have Lightroom 5.7 and an Apple TV to connect my Mac to the TV screen. Using iTunes to share photos with the TV, I cannot locate all photo files. What is the best way to save a slide show where Mac sharing can find it?

    I have Lightroom 5.7. Now I have an Apple TV to connect my Mac to the TV screen. I wish to do a slide show on the TV. However, on the Mac, using iTunes to share photos with the TV, I cannot locate all photo files. What is the best way to save the slide show and put it where Mac sharing can find it? So far, I made one folder in Lightroom, put some photos there, and I found them easily. Can this be done with a slide show? Please help quickly! I am also worried that the photos I put in the new folder will be hard to find afterwards, when the show is done. Where do they go when I delete them from the new folder?

    Not that I'm aware of. You just export JPEG copies to a folder that you can point iTunes to. For instance, I have created a folder in my Pictures folder called Apple TV, and within that folder I have other folders of pictures that I can choose from in iTunes to share with Apple TV. There doesn't seem to be any way to share a Lightroom slideshow directly. If you were to create a video file, that would probably work. Apple TV is a little clunky in my opinion; some things are more difficult to do now than they were a while back. I probably haven't provided you with much help, but just keep experimenting and I think you will figure it out.

  • I am moving from PC to Mac. My PC has two internal drives and I have a 3TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    I am moving from PC to Mac. My PC has two internal drives and I have a 3TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    Paragon even has a non-destructive conversion utility if you do want to change the drive's format.
    Hard to imagine using a 3TB drive that isn't NTFS. The Mac uses GPT as the default partition type, as well as HFS+.
    www.paragon-software.com
    Some general Apple Help www.apple.com/support/
    Also,
    Mac OS X Help
    http://www.apple.com/support/macbasics/
    Isolating Issues in Mac OS
    http://support.apple.com/kb/TS1388
    https://www.apple.com/support/osx/
    https://www.apple.com/support/quickassist/
    http://www.apple.com/support/mac101/help/
    http://www.apple.com/support/mac101/tour/
    Get Help with your Product
    http://docs.info.apple.com/article.html?artnum=304725
    Apple Mac App Store
    https://discussions.apple.com/community/mac_app_store/using_mac_apple_store
    How to Buy Mac OS X Mountain Lion/Lion
    http://www.apple.com/osx/how-to-upgrade/
    TimeMachine 101
    https://support.apple.com/kb/HT1427
    http://www.apple.com/support/timemachine
    Mac OS X Community
    https://discussions.apple.com/community/mac_os

  • What is the best way to mirror the data from production to another server?

    Hi,
    Here we use Streams and advanced replication to send the data for 90% of the tables from production to another production database server, so that if one goes down we can use the other. Is there any better option than Streams and replication? We are having a lot of problems with Streams these days; it keeps breaking and we get calls.
    I have heard about Data Guard but don't know what it is used for. Please advise on the best way to replicate the data.
    Thanks a lot.....

    RAC, Data Guard. The first one is active-active, that is, you have two or more nodes accessing the same database on shared storage and you get both HA and load balancing. The second is active-passive (unless you're on 11.2 with Active Standby or Snapshot Standby), that is one database is primary and the other is standby, which you normally cannot query or modify, but to which you can quickly switch in case primary fails. There's also Logical Standby - it's based on Streams and generally looks like what you seem to be using now (sort of.) But it definitely has issues. You can also take a look at GoldenGate or SharePlex.

  • Can we split and fetch the records in Database Adapter

    Hi,
    I designed a Database Adapter to fetch records from an Oracle database. Sometimes the Database Adapter needs to fetch around 5,000 or 10,000 records in a single shot. In that case my BPEL process chokes and fails with:
    java.lang.OutOfMemoryError: Java heap space at java.util.Arrays.copyOf(Arrays.java:2882) at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
    Could someone help me resolve this?
    In the Database Adapter, can we split and fetch the records when there are more than 1,000?
    For example, the first 100 records as one set, the next 100 as a second set, and so on.
    Thank you.

    You can send the records in batches using the debatching feature of the DB adapter. Refer to the documentation for implementation details.

  • I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. What is the best way to keep the files on these machines synchronized?

    I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. After a few days or weeks, some folders on either machine hold new files that are not on the other machine. What is the best way to keep the files on these machines synchronized?

    How did you transfer the files to the iMac? If you exported the files out of the MacBook library using Kind = Current, you should get the edited versions. Any other option may not.
    If you want to keep the two libraries "synced", any photos you want to move to the iMac should be added to an album. Connect the two Macs with a LAN, Target Disk Mode over FireWire, or WiFi, and use the paid version of iPhoto Library Manager to copy that album from the MacBook library to the iMac library. It will also copy the original and edited versions, keywords, titles, etc.
    OT

  • What is the best way to make the built-in battery last longer?

    What is the best way to make the built-in battery in a MacBook Pro last longer?
    Is it by keeping it plugged in as much as possible, or by discharging it to 80-90% once a week and then charging it back up?

    My favorite tip for longer battery life: Uninstall Flash

  • iPhone 4S coming Friday. What is the best way to get the Notes content from an iPhone 4 to the 4S without doing a restore? I want the new phone to be totally new but am not sure how to get the Notes content across.

    What is the best way to get the Notes content from an iPhone 4 to a 4S without doing a restore? I want the new phone to be totally new, but I'm not sure how to get the Notes content across. When I have previously restored from one iPhone to another, it has shown (in Settings > Usage) the cumulative usage from previous phones, so all the hours of calls on all previous iPhones are displayed even though the phone is brand new. Does anyone know how I can get my notes (from the standard iPhone Notes app) to my new iPhone 4S without restoring from the previous iPhone 4? Thanks for any help offered.

    First, if you haven't updated to iTunes 10.5, please update now as you will need it for the iPhone 4S and iOS 5.
    Once you're done installing iTunes 10.5, open it. Connect your iPhone to iTunes using the USB cable. Once your iPhone pops up right click on it. For example: an iPhone will appear and it will say "Ryan's iPhone."
    Right click on it and select "Backup" from the dropdown menu. It will start backing up. This should backup your notes.
    Please tell me if you have any problems with backing up.
    Once you backup and get your iPhone 4S, you must follow these steps. If you don't follow these steps, you will not be able to get your notes on your new iPhone 4S.
    Open up iTunes again, then right click on your device (iPhone 4S). You will see a dropdown menu that says "Restore from Backup..." Select this and it will ask for a backup; pick yours from the dropdown menu, for example "Ryan's iPhone - October 12, 2011", and it will restore from that backup. Do this when you first get your iPhone 4S so you will not lose anything. Even though you're restoring, you're getting back your previous settings, notes, contacts, mail, and other data from your old iPhone. You'll still have Siri, though. Also, back up your device frequently; it will be worth it. You can restore from a backup if something goes wrong, or save your data for a future update.
    Once you do that, you should have your notes on your new iPhone 4S and iOS 5.
    Please tell me if you need any help.
    I hoped I answered your questions and solved your problem!

  • What is the best way to export the data out of BW into a flat file on the server?

    Hi All,
    We are BW 7.01 (EHP 1, Service Pack Level 7).
    As part of our BW project scope for our current release, we will be developing certain reports in BW, and for certain reports, the existing legacy reporting system based out of MS Access and the old version of Business Objects Release 2 would be used, with the needed data supplied from the BW system.
    What is the best way to export the data out of BW into a flat file on the server at regular intervals using a process chain?
    Thanks in advance,
    - Shashi

    Hello Shashi,
    some comments:
    1) An "open hub license" is required for all processes that extract data from BW to a non-SAP system (including APD). Please check with your SAP Account Executive for details.
    2) The limitation of 16 key fields is only valid when using open hub for extracting to a DB table. There's no such limitation when writing files.
    3) Open hub is the recommended solution since it's the easiest to implement, no programming is required, and you don't have to worry much about scaling with higher data volumes (APD and CRM BAPI are quite different in all of these aspects).
    For completeness, here's the most recent documentation which also lists other options:
    http://help.sap.com/saphelp_nw73/helpdata/en/0a/0212b4335542a5ae2ecf9a51fbfc96/frameset.htm
    Regards,
    Marc
    SAP Customer Solution Adoption (CSA)
