Best way to distribute the calls

Hi all,
I would like to ask for a suggestion on scripting.
Let's say I have 8 groups of agents and a Menu step for user input. Once the caller selects an option, what is recommended for the next step?
1. Set the queue to the mapped CSQ, and "Go to" the "Select Resource" step for that queue; or
2. Redirect the call to other trigger numbers, and create 8 separate scripts and applications for the different groups.
Generally the groups share the same call flow, but I wonder which approach is recommended. Thank you.

 If the routing depends only on the user's input, just route the call to the respective CSQ whose agents will assist the caller. I would go for the CSQ option, as it keeps things simple.
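 A UCCX script is assembled from Menu, Set and Select Resource steps in the script editor rather than written as source code, so there is nothing literal to paste here; purely as an illustration of why a single script can cover all 8 groups, here is a hedged Java-style sketch of the digit-to-CSQ mapping that one script would carry (the CSQ names are hypothetical):

 import java.util.Map;

 // Illustrative only: in the real script this mapping would be a Set step (or an
 // XML/DB lookup) whose result feeds the CSQ used by the Select Resource step.
 public class CsqRouting {
     private static final Map<String, String> OPTION_TO_CSQ = Map.of(
             "1", "CSQ_Group1",
             "2", "CSQ_Group2",
             "3", "CSQ_Group3");   // ... and so on up to option 8

     public static String csqFor(String menuDigit) {
         // Unknown or timed-out input falls back to a default queue.
         return OPTION_TO_CSQ.getOrDefault(menuDigit, "CSQ_Default");
     }
 }

 Keeping the mapping in one place also means later changes (a new group, a renamed queue) touch one script instead of eight applications and triggers.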

Similar Messages

  • What is the best way to distribute your Swing application

    I have developed an application that connects to a database and contains the functionality to manage an invoices and receipts system. I am afraid to distribute it by building it into a JAR using the NetBeans build option: it runs fine, but if someone extracts the JAR file I am worried that anyone can easily decompile it using JAD and get the database credentials. Please tell me the best way to distribute the application so that reverse-engineering it will not be possible. :(

    Decompiling is unnecessary. All somebody would have to do would be to sniff the transmissions between your application and your database; the credentials are sent unencrypted.
    So the problem is not that somebody could find out the database credentials, the problem is that your database exposes itself to the internet. And if anybody does find the credentials through any method at all, then you've got a problem.
    And by the way if you distribute your application with the credentials hard-coded, then that makes it difficult for you to change the password if it does get compromised, because then nobody can use your application any more. This is a bad thing because one of the first things you should do when your system is compromised is to change the access password.
    So really the best way to distribute this application would be to write it so that it connects to an application which runs on your server. This server application would communicate with the database, which would make it unnecessary for the database to be visible from the internet. Your Swing application would communicate with the server application via some kind of web service protocol.
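    A minimal sketch of that architecture, assuming a hypothetical HTTPS endpoint on your server (the URL, path and token-based auth below are illustrative, not something from the original post): the Swing client only ever holds a per-user token, never the database credentials.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // The client talks to a server-side service; only that service knows the DB credentials.
    public class InvoiceClient {
        private static final HttpClient HTTP = HttpClient.newHttpClient();

        public static String fetchInvoice(String invoiceId, String userToken) throws Exception {
            HttpRequest request = HttpRequest.newBuilder(
                            URI.create("https://your-server.example/api/invoices/" + invoiceId))
                    .header("Authorization", "Bearer " + userToken)   // per-user auth, not DB creds
                    .GET()
                    .build();
            HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());
            return response.body();   // e.g. a JSON document describing the invoice
        }
    }

    Decompiling the JAR then reveals nothing more sensitive than an endpoint URL, and a compromised user token can be revoked without redeploying the application.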

  • Best way to distribute LabVIEW instrument drivers

    Hello,
    I'm trying to stick to the standards described here:
    Developing LabVIEW Plug and Play Instrument Drivers
    Instrument Driver Development Tools and Resources
    Instrument Driver Guidelines
    However, I cannot see clearly what the best way to distribute LabVIEW instrument drivers is, except that I need to be compliant with those standards to be on the IDNET (Instrument Drivers Network).
    Here are a couple of questions I'm not really sure about:
    Is it okay to use a .NET DLL and make some calls into it?
    Is it fine to hide the block diagram, i.e. password-protect the drivers?
    Can we prevent modification of them?
    In my situation several devices (or let's say modules) can be controlled through one communication medium, so basically I might have one instrument driver for several things.
    May I use some OOD, i.e. control the equipment with methods, set some properties, and encapsulate the communication in classes? Some equipment could be considered as inheriting from others (more recent models, for instance), or is this totally prohibited by the guidelines above?
    What is the most suitable build specification type: packed library or something else? If a packed library, how do I then handle the version issue when a library created in an earlier LabVIEW version is opened with a newer one?
    How do I deal with the copyright notice: do I need to copy and paste it onto both the front panel and the block diagram?
    Is this copyright notice enough: "Copyright (c) <Company Name>. All rights reserved"?

    Ehouarn wrote:
    (original question quoted in full above)
    Not sure about 1), but 2) and 3) are definitely a no-go if you want your library to be distributable through the ID network. As far as I understand, the standard only allows DLLs that are really developed in C, and it would require you to also distribute the C code.
    OOP may not be a problem.
    A packed library is definitely not something you want to do. They only work in the LabVIEW version in which they were created. You will regret the moment you decided to go with packed libraries as requests come in for other LabVIEW versions and your drivers get bashed on all discussion forums as being a pain in the ass to use.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • I have Lightroom 5.7. Now I have an Apple TV to connect my Mac to the TV screen. I wish to do a slide show on the TV. However, on the Mac, using iTunes to share photos with the TV, I cannot locate all photo files. What is the best way to save the slide show and put it where iTunes sharing can find it?

    I have Lightroom 5.7. Now I have an Apple TV to connect my Mac to the TV screen. I wish to do a slide show on the TV. However, on the Mac, using iTunes to share photos with the TV, I cannot locate all the photo files. What is the best way to save the slide show and put it where the Mac's sharing can find it? So far, I made one folder in Lightroom, put some photos there and found them easily. Can this be done with a slide show? Please help quickly! I am also worried that the photos I put in the new folder will be hard to find afterwards, when the show is done. Where do they go when I delete them from the new folder?

    Not that I'm aware of. You just export JPEG copies to a folder that you can point iTunes to. For instance, I have created a folder in my Pictures folder called Apple TV, and within that folder I have other folders of pictures that I can choose from in iTunes to share with Apple TV. But there doesn't seem to be any way to share a Lightroom slideshow directly. If you were able to create a video file, that would probably work. Apple TV is a little clunky in my opinion; some things are a little more difficult to do now than they were a while back. I probably haven't provided you with much help, but just keep experimenting and I think you will figure it out.

  • What is the best way to replicate the data from production to another server?

    Hi,
    Here we use Streams and Advanced Replication to send the data for 90% of the tables from production to another production database server, so that if one goes down we can use the other. Is there any better option than Streams and replication? We are having a lot of problems with Streams these days; they keep breaking and we get calls.
    I have heard about Data Guard but don't know what it is used for. Please advise on the best way to replicate the data.
    Thanks a lot.....

    RAC, Data Guard. The first one is active-active, that is, you have two or more nodes accessing the same database on shared storage and you get both HA and load balancing. The second is active-passive (unless you're on 11.2 with Active Standby or Snapshot Standby), that is one database is primary and the other is standby, which you normally cannot query or modify, but to which you can quickly switch in case primary fails. There's also Logical Standby - it's based on Streams and generally looks like what you seem to be using now (sort of.) But it definitely has issues. You can also take a look at GoldenGate or SharePlex.

  • Best way to port the data from one DB to another DB using BizTalk

    Hi,
    Please suggest the best way to move data from one DB to another DB using BizTalk.
    Currently I do it like this: each transaction (coming from different source tables) arrives through a receive port, some mapping is done (custom logic for data mapping), the result is inserted into the normalized target tables (multiple tables), and then the status of the transaction is updated back in the source table in the source DB. It processes one record at a time.
    What is the best way to do this as a bulk transfer and then update the status, since there are more than 10,000 transactions per call?
    Thanks,
    Vinoth

    Hi Vinoth,
    For SQL bulk inserts you can always use the SQL Bulk Load adapter:
    http://www.biztalkgurus.com/biztalk_server/biztalk_blogs/b/biztalksyn/archive/2005/10/23/processing-a-large-flat-file-message-with-biztalk-and-the-sqlbulkinsert-adapter.aspx
    However, even though the SQL Bulk Load adapter can efficiently insert a large amount of data into SQL, you are still stuck with the overhead of pushing everything through the MessageBox database and the memory issues of dealing with really large messages.
    I would personally suggest you use SSIS, as you have mentioned that records have to be processed at a specific time of day as opposed to whenever the records become available.
    Please refer to this link for more information about SSIS: http://msdn.microsoft.com/en-us/library/ms141026.aspx
    If you have any more questions related to SSIS, please ask them in the SSIS forum and you will get specific support.
    Rachit
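    Independently of whether BizTalk, the SQLBulkInsert adapter or SSIS is used, the underlying idea is to stop doing one insert plus one status update per transaction. As a generic illustration of that batching pattern (not a BizTalk artifact; table, column and class names are hypothetical), a JDBC sketch that flushes inserts in chunks:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.List;

    // Accumulate rows and flush them in batches instead of a round trip per record.
    public class BulkMover {
        public static void move(List<String[]> rows, String targetJdbcUrl) throws Exception {
            try (Connection con = DriverManager.getConnection(targetJdbcUrl);
                 PreparedStatement ins = con.prepareStatement(
                         "INSERT INTO target_tx (tx_id, payload) VALUES (?, ?)")) {
                con.setAutoCommit(false);
                int count = 0;
                for (String[] r : rows) {
                    ins.setString(1, r[0]);
                    ins.setString(2, r[1]);
                    ins.addBatch();
                    if (++count % 1000 == 0) {
                        ins.executeBatch();   // flush every 1000 rows
                    }
                }
                ins.executeBatch();           // flush the remainder
                con.commit();                 // one commit for the run, not one per row
            }
        }
    }

    The status update back to the source table can be handled the same way: collect the processed IDs and issue one batched UPDATE per chunk.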

  • iPhone 4S coming Friday: what is the best way to get the Notes content from an iPhone 4 to the 4S without doing a restore? I want the new phone to be totally new but am not sure how to get the Notes content across.

    What is the best way to get the Notes content from my iPhone 4 to the 4S without doing a restore? I want the new phone to be totally new, but I am not sure how to get the Notes content across. If I do a restore, as I have done previously from one iPhone to another, it shows (in Settings > Usage) the cumulative usage from previous phones, so all the hours of calls on all previous iPhones are displayed even though the phone is brand new. Does anyone know how I can get my notes (from the standard iPhone Notes app) onto my new iPhone 4S without restoring from the previous iPhone 4? Thanks for any help offered.

    First, if you haven't updated to iTunes 10.5, please update now as you will need it for the iPhone 4S and iOS 5.
    Once you're done installing iTunes 10.5, open it. Connect your iPhone to iTunes using the USB cable. Once your iPhone pops up right click on it. For example: an iPhone will appear and it will say "Ryan's iPhone."
    Right click on it and select "Backup" from the dropdown menu. It will start backing up. This should backup your notes.
    Please tell me if you have any problems with backing up.
    Once you backup and get your iPhone 4S, you must follow these steps. If you don't follow these steps, you will not be able to get your notes on your new iPhone 4S.
    Open up iTunes again, then right-click on your device (iPhone 4S). You will see a dropdown menu with "Restore from Backup..." Select this and it will ask for a backup; choose it from the dropdown menu, for example "Ryan's iPhone - October 12, 2011." Pick that and it will restore from your backup. Do this when you get your iPhone 4S so you will not lose anything. Even though you're restoring, you're not losing anything new, since you're only bringing over the previous settings, notes, contacts, mail and other settings from your old iPhone. You'll still have Siri! So restore when you first get it. Also, back up your device frequently, as it will be worth it: you can restore from a backup if something goes wrong, or save your data for a future update.
    Once you do that, you should have your notes on your new iPhone 4S and iOS 5.
    Please tell me if you need any help.
    I hope I answered your questions and solved your problem!

  • Best way to Fetch the record

    Hi,
    Please suggest the best way to fetch records from the table described below. It is Oracle 10gR2 on Linux.
    Whenever a client visits the office a record is created for him. The company policy is to maintain 10 years of data in the transaction table, and the table accumulates about 3 million records per year.
    The table has the following key columns for the SELECT (sample table):
    Client_Visit
    ID            NUMBER(12,0)   -- sequence-generated number
    EFF_DTE       DATE           -- effective date of the customer (sometimes the client becomes invalid and then valid again)
    CREATE_TS     TIMESTAMP(6)
    CLIENT_ID     NUMBER(9,0)
    CASCADE_FLG   VARCHAR2(1)
    On most of the reports the records are fetched by MAX(eff_dte), MAX(create_ts) and cascade flag = 'Y'.
    I have the following two queries, but both of them are not cost-effective and take 8 minutes to display the records.
    Code 1:
    SELECT au_subtyp1.au_id_k,
           au_subtyp1.pgm_struct_id_k
      FROM au_subtyp au_subtyp1
     WHERE au_subtyp1.create_ts =
              (SELECT MAX (au_subtyp2.create_ts)
                 FROM au_subtyp au_subtyp2
                WHERE au_subtyp2.au_id_k = au_subtyp1.au_id_k
                  AND au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                  AND au_subtyp2.eff_dte =
                         (SELECT MAX (au_subtyp3.eff_dte)
                            FROM au_subtyp au_subtyp3
                           WHERE au_subtyp3.au_id_k = au_subtyp2.au_id_k
                             AND au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                             AND au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
       AND au_subtyp1.exists_flg = 'Y'
    Explain Plan
    Plan hash value: 2534321861
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  1 |  FILTER                  |           |       |       |       |            |          |
    |   2 |   HASH GROUP BY          |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  3 |    HASH JOIN             |           |  1404K|   121M|    19M| 33178   (1)| 00:06:39 |
    |*  4 |     HASH JOIN            |           |   307K|    16M|  8712K| 23708   (1)| 00:04:45 |
    |   5 |      VIEW                | VW_SQ_1   |   307K|  5104K|       | 13493   (1)| 00:02:42 |
    |   6 |       HASH GROUP BY      |           |   307K|    13M|   191M| 13493   (1)| 00:02:42 |
    |*  7 |        INDEX FULL SCAN   | AUSU_PK   |  2809K|   125M|       | 13493   (1)| 00:02:42 |
    |*  8 |      INDEX FAST FULL SCAN| AUSU_PK   |  2809K|   104M|       |  2977   (2)| 00:00:36 |
    |*  9 |     TABLE ACCESS FULL    | AU_SUBTYP |  1404K|    46M|       |  5336   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
       3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
       4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
       7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
                  hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
           filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
                  "AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
       8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
       9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')Code 2:
    I already raised a thread a week back and Dom suggested the following query, it is cost effective but the performance is same and used the same amount of Temp tablespace
    select au_id_k,pgm_struct_id_k from (
    SELECT au_id_k
          ,      pgm_struct_id_k
          ,      ROW_NUMBER() OVER (PARTITION BY au_id_k ORDER BY eff_dte DESC, create_ts DESC) rn,
          create_ts, eff_dte,exists_flg
          FROM   au_subtyp
          WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
          AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD') 
          ) d  where rn =1   and exists_flg = 'Y'
    --Explain Plan
    Plan hash value: 4039566059
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  1 |  VIEW                    |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  2 |   WINDOW SORT PUSHED RANK|           |  2809K|   133M|   365M| 40034   (1)| 00:08:01 |
    |*  3 |    TABLE ACCESS FULL     | AU_SUBTYP |  2809K|   133M|       |  5345   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
       2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
                  INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
       3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
                   2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
    Thanks,
    Vijay

    Hi Justin,
    Thanks for your reply. I am running this on our test environment as I don't want to run it on the production environment now. The test environment holds 2,809,605 records (roughly 2.8 million).
    The query output count is 281,699 records (roughly 280 thousand) and the selectivity is 0.099. The number of distinct (create_ts, eff_dte, exists_flg) combinations is 2,808,905. I am sure the index scan is not going to help much, as you said.
    The core problem is that both queries use a lot of temp tablespace. When we use this query to join to other tables, which have the same design as below, the temp tablespace usage grows even bigger.
    Both the production and test environments are 3-node RAC.
    First Query...
    CPU used by this session     4740
    CPU used when call started     4740
    Cached Commit SCN referenced     21393
    DB time     4745
    OS Involuntary context switches     467
    OS Page reclaims     64253
    OS System time used     26
    OS User time used     4562
    OS Voluntary context switches     16
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     2487
    bytes sent via SQL*Net to client     15830
    calls to get snapshot scn: kcmgss     37
    consistent gets     52162
    consistent gets - examination     2
    consistent gets from cache     52162
    enqueue releases     19
    enqueue requests     19
    enqueue waits     1
    execute count     2
    ges messages sent     1
    global enqueue gets sync     19
    global enqueue releases     19
    index fast full scans (full)     1
    index scans kdiixs1     1
    no work - consistent read gets     52125
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time cpu     1
    parse time elapsed     1
    physical write IO requests     69
    physical write bytes     17522688
    physical write total IO requests     69
    physical write total bytes     17522688
    physical write total multi block requests     69
    physical writes     2139
    physical writes direct     2139
    physical writes direct temporary tablespace     2139
    physical writes non checkpoint     2139
    recursive calls     19
    recursive cpu usage     1
    session cursor cache hits     1
    session logical reads     52162
    sorts (memory)     2
    sorts (rows)     760
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     1
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     9
    Second Query
    CPU used by this session     1197
    CPU used when call started     1197
    Cached Commit SCN referenced     21393
    DB time     1201
    OS Involuntary context switches     8684
    OS Page reclaims     21769
    OS System time used     14
    OS User time used     1183
    OS Voluntary context switches     50
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     767
    bytes sent via SQL*Net to client     15745
    calls to get snapshot scn: kcmgss     17
    consistent gets     23871
    consistent gets from cache     23871
    db block gets     16
    db block gets from cache     16
    enqueue releases     25
    enqueue requests     25
    enqueue waits     1
    execute count     2
    free buffer requested     1
    ges messages sent     1
    global enqueue get time     1
    global enqueue gets sync     25
    global enqueue releases     25
    no work - consistent read gets     23856
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time elapsed     1
    physical read IO requests     27
    physical read bytes     6635520
    physical read total IO requests     27
    physical read total bytes     6635520
    physical read total multi block requests     27
    physical reads     810
    physical reads direct     810
    physical reads direct temporary tablespace     810
    physical write IO requests     117
    physical write bytes     24584192
    physical write total IO requests     117
    physical write total bytes     24584192
    physical write total multi block requests     117
    physical writes     3001
    physical writes direct     3001
    physical writes direct temporary tablespace     3001
    physical writes non checkpoint     3001
    recursive calls     25
    session cursor cache hits     1
    session logical reads     23887
    sorts (disk)     1
    sorts (memory)     2
    sorts (rows)     2810365
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     2
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     5
    Thanks,
    Vijay

  • I am moving from PC to Mac. My PC has two internal drives and I have a 3TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    I am moving from PC to Mac. My PC has two internal drives and I have a 3TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    Paragon even has a non-destructive conversion utility if you do want to convert the drive.
    It is hard to imagine a 3TB drive that isn't NTFS. The Mac uses GPT as the default partition scheme, along with HFS+.
    www.paragon-software.com
    Some general Apple Help www.apple.com/support/
    Also,
    Mac OS X Help
    http://www.apple.com/support/macbasics/
    Isolating Issues in Mac OS
    http://support.apple.com/kb/TS1388
    https://www.apple.com/support/osx/
    https://www.apple.com/support/quickassist/
    http://www.apple.com/support/mac101/help/
    http://www.apple.com/support/mac101/tour/
    Get Help with your Product
    http://docs.info.apple.com/article.html?artnum=304725
    Apple Mac App Store
    https://discussions.apple.com/community/mac_app_store/using_mac_apple_store
    How to Buy Mac OS X Mountain Lion/Lion
    http://www.apple.com/osx/how-to-upgrade/
    TimeMachine 101
    https://support.apple.com/kb/HT1427
    http://www.apple.com/support/timemachine
    Mac OS X Community
    https://discussions.apple.com/community/mac_os

  • I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. What is the best way to keep the files on these machines synchronized?

    I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. After a few days or weeks, I have new files in some folders on one of the machines that are not on the other machine. What is the best way to keep the files on these machines synchronized?

    How did you transfer the files to the iMac? If you exported the files out of the MacBook library using Kind = Current you should get the edited versions. Any other option may not.
    If you want to keep the two libraries "synced", any photos you want to move to the iMac should be added to an album; connect the two Macs (over a LAN, Target Disk Mode, FireWire, or WiFi) and use the paid version of iPhoto Library Manager to copy that album from the MacBook library to the iMac library. It will also copy the original and edited versions, keywords, titles, etc.
    OT

  • What is the best way to make the built-in battery last longer?

    What is the best way to make the built-in battery in a MacBook Pro last longer?
    Is it to keep it plugged in as much as possible, discharge it to 80-90% once a week, and then charge it back up?

    My favorite tip for longer battery life: Uninstall Flash

  • What is the best way to export the data out of BW into a flat file on the server?

    Hi All,
    We are on BW 7.01 (EHP 1, Service Pack Level 7).
    As part of our BW project scope for the current release, we will develop certain reports in BW; for certain other reports, the existing legacy reporting system based on MS Access and the old Business Objects Release 2 will be used, with the needed data supplied from the BW system.
    What is the best way to export the data out of BW into a flat file on the server at regular intervals using a process chain?
    Thanks in advance,
    - Shashi

    Hello Shashi,
    some comments:
    1) An "open hub license" is required for all processes that extract data from BW to a non-SAP system (including APD). Please check with your SAP Account Executive for details.
    2) The limitation of 16 key fields is only valid when using open hub for extracting to a DB table. There's no such limitation when writing files.
    3) Open hub is the recommended solution since it's the easiest to implement, no programming is required, and you don't have to worry much about scaling with higher data volumes (APD and CRM BAPI are quite different in all of these aspects).
    For completeness, here's the most recent documentation which also lists other options:
    http://help.sap.com/saphelp_nw73/helpdata/en/0a/0212b4335542a5ae2ecf9a51fbfc96/frameset.htm
    Regards,
    Marc
    SAP Customer Solution Adoption (CSA)

  • Hi, I would like to show a presentation written on my MacBook Air on my television. What is the best way to connect the two?

    Hi, I would like to show a presentation written on my MacBook Air on my television. What is the best way to connect the two?

    Much depends on the available connectivity  of your television.
    If it has an HDMI input available, all you will need is a Mini DisplayPort to HDMI adapter, and an HDMI cable long enough to place the Mac in the desired proximity to the TV. The adapter is available from the Apple Store online.
    Provided the HDMI input on the television supports sound, and the model of your MBA supports it as well, make sure that your adapter also carries sound; otherwise you will need to run separate cabling from the headphone jack of the MBA to the TV's sound input or to external speakers.
    This is a somewhat specific answer to a question without the requisite detail: you don't indicate the model or type of inputs on your TV, nor do you indicate the model/generation of MBA you have.

  • What's the best way to get the server name in a servlet deployed to a cluster?

              Hi,
              I have a servlet in a web application that is deployed to a cluster, just
              wondering what is the best way to get the name of the node that the server is
              running on at run time??
              Thanks
              

              Please try to modify the following code and test it for your purpose (check the WebLogic class documentation for details):

              import javax.naming.*;
              import weblogic.jndi.*;
              import weblogic.management.*;
              import weblogic.management.configuration.*;
              import weblogic.management.runtime.*;

              // admin_url, username, password (and optionally serverName) are assumed
              // to be supplied by the caller, e.g. from servlet init parameters.
              MBeanHome home = null;
              Context ctx = null;
              try {
                  // The Environment class represents the properties used to create an
                  // initial context. The default constructor builds an Environment with
                  // default properties, i.e. a WebLogic initial context. If unset,
                  // principal and credentials default to guest/guest and the provider
                  // URL defaults to "t3://localhost:7001".
                  if (admin_url != null) {
                      Environment env = new Environment();
                      // Sets the Context.PROVIDER_URL property to the admin server URL.
                      env.setProviderUrl(admin_url);
                      // Sets the Context.SECURITY_PRINCIPAL property.
                      env.setSecurityPrincipal(username);
                      // Sets the Context.SECURITY_CREDENTIALS property.
                      env.setSecurityCredentials(password);
                      // Returns an initial context based on the Environment properties.
                      ctx = env.getInitialContext();
                  } else {
                      ctx = new InitialContext();
                  }
                  home = (MBeanHome) ctx.lookup(MBeanHome.ADMIN_JNDI_NAME);
                  // or, if looking up a specific MBeanHome:
                  // home = (MBeanHome) ctx.lookup(MBeanHome.JNDI_NAME + "." + serverName);
                  DomainMBean dmb = home.getActiveDomain();   // get the active domain
                  ServerMBean[] sbeans = dmb.getServers();    // get all servers in the domain
                  if (sbeans != null) {
                      for (int s1 = 0; s1 < sbeans.length; s1++) {
                          String privip = sbeans[s1].getListenAddress();
                          String name = sbeans[s1].getName();
                          int port = sbeans[s1].getListenPort();
                          WebServerMBean wmb = sbeans[s1].getWebServer();
                          // ... compare privip/name against the local host to find this node
                      }
                  }
              } catch (Exception ex) {
                  ex.printStackTrace();
              } finally {
                  if (ctx != null) {
                      try { ctx.close(); } catch (Exception ignore) {}   // free the JNDI context
                  }
              }
              "Gao Jun" <[email protected]> wrote:
              >Is there any sample code? Thanks
              >
              >Best Regards,
              >Jun Gao
              >
              >"Xiang Rao" <[email protected]> wrote in message
              >news:[email protected]...
              >>
              >> Sure. You can use the Weblogic management APIs to query ServerBean.
              >>
              >>
              >> "Me" <[email protected]> wrote:
              >> >
              >> >Thanks for your reply, i was hoping to find a better way for example
              >> >a class in
              >> >weblogic API.
              >> >
              >> >Thanks
              >> >
              >> >"Xiang Rao" <[email protected]> wrote:
              >> >>
              >> >>Physical: InetAddress.getLocalHost().getHostName()
              >> >>Weblogic: System.getProperty("weblogic.Name");
              >> >>
              >> >>
              >> >>"Me" <[email protected]> wrote:
              >> >>>
              >> >>>Hi,
              >> >>> I have a servlet in a web application that is deployed to a
              >cluster,
              >> >>>just
              >> >>>wondering what is the best way to get the name of the node that
              >the
              >> >>server
              >> >>>is
              >> >>>running on at run time??
              >> >>>
              >> >>>Thanks
              >> >>
              >> >
              >>
              >
              >
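              Building on the two one-liners quoted above, a minimal sketch that returns both the WebLogic server name and the physical host name (the class and method names are just for illustration):

              import java.net.InetAddress;

              public class NodeInfo {
                  public static String nodeName() throws Exception {
                      // WebLogic sets the managed server's name as a system property in each JVM.
                      String wlsName = System.getProperty("weblogic.Name");
                      // Physical host name of the machine this JVM is running on.
                      String host = InetAddress.getLocalHost().getHostName();
                      return (wlsName != null ? wlsName : "unknown") + " @ " + host;
                  }
              }

              For most servlets this is enough; the MBean approach above is only needed when you also want listen addresses, ports or other configuration for every node in the cluster.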
              

  • What is the best way to get the new iPhone/ iPhone 5

    What is the best way to get the new iPhone (iPhone 5) when it is released? I want to get the phone as quickly as possible. If I order it on the first day of pre-orders, will I get it on the same day as the release in stores? I'm on an unlimited data plan with AT&T. Because the new iPhone will most likely have 4G data, will I lose the unlimited plan?

    Apple has not made any announcement.  We are not allowed to speculate on this forum.

Maybe you are looking for

  • Why does calendar invite .ics have wrong Organizer?

    When I receive a calendar invite from an iCloud user into my Outlook on Windows, I click the Accept button in the email, which takes me to an iCloud page showing that I accepted. It has the correct organizer (the person who sent the invite) and invitee (me)

  • Sending clip from Premiere to Photoshop

    I am looking for a way to send edited clips from a Premiere timeline directly into Photoshop to grade with Adobe Camera Raw (please don't tell me to use SpeedGrade or Resolve; I like ACR and it's what I'm used to). The problem is that I need to import

  • Maintain Condition Records in CRM

    Hello, In R/3 I use transactions VK11 / VK12 and VK13 to create, change and display condition records. I was wondering if there are similar transactions in CRM to maintain the condition records. (Because when I create a sales order in CRM I got the e

  • My old PC died. Can't get music to new iMac w/o old PC???

    I've read as much as possible in the thousands of threads on here, but the only solution not involving my old PC is a program called Senuti, which doesn't recognize my 5th-gen 30GB iPod. I have 3000 family photos and 2500 songs in limbo on my PC formatted

  • Difference between the data staging of generic and application-specific data sources

    Hi, Can anyone tell me the difference between the data staging of generic and application-specific data sources? We know that LO data is staged in the queued delta, update queue and BW delta queue; I want to know where the generic data stages before it is