Is it possible to take the CDR data from a v4.2 Call Manager and copy it to a separate server where it would be made available for reporting?

Is it possible to take the CDR data from a v4.2 Call Manager and copy it to a separate server where it would be made available for reporting? We are not interested in migrating the CDR data to v6 because of the concerns it introduces to the upgrade process. Is it possible to get the raw data and somehow serve it from a different machine? (knowing it would be 'old' data that stops as of a certain date). If so, what would be the complexity involved in doing so?
It seems like the CDR data lives within MSSQL and the reporting interface is within the web server portion of the Call Manager... that's as far as we've dug so far.

Hi
It is absolutely possible to get the data - anyone in your org with basic SQL skills can move the data off to a standalone SQL server. This could be done most simply by backing up and restoring the DB using SQL Enterprise Manager (a rough sketch of the equivalent backup/restore commands follows below this post).
Moving the CAR/ART reporting tool would be more difficult... if you do actually use that for reporting (most people find it doesn't do what they need and don't use it for anything but basic troubleshooting, and get a third party package) then the best option may be to keep your publisher (possibly assigning it a new IP) and leave it running for as long as you need reporting.
You would then need a new server to run your upgraded V6 CCM; you may find you need this anyway.
Regards
Aaron
Please rate helpful posts...
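A minimal sketch of that backup/restore step, driven from Python via pyodbc rather than Enterprise Manager, is below. The database name CDR, the server names, the Windows-trusted authentication, and the file paths are all assumptions to adjust for your environment, and the .bak file still has to be copied from the publisher to the reporting server between the two steps.

# Sketch: copy the CallManager CDR database to a standalone SQL Server
# box for reporting. Server names, authentication, and paths are placeholders.
import pyodbc

def backup_cdr(publisher_host, backup_path):
    # Take a full backup of the CDR database on the CallManager publisher.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=" + publisher_host +
        ";DATABASE=master;Trusted_Connection=yes;",
        autocommit=True,  # BACKUP/RESTORE cannot run inside a transaction
    )
    cursor = conn.cursor()
    cursor.execute("BACKUP DATABASE CDR TO DISK = '" + backup_path + "' WITH INIT")
    while cursor.nextset():  # drain progress messages so the backup completes
        pass
    conn.close()

def restore_cdr(reporting_host, backup_path):
    # Restore that backup on the standalone reporting SQL Server.
    # Add WITH MOVE clauses if the data/log file paths differ on the new box.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=" + reporting_host +
        ";DATABASE=master;Trusted_Connection=yes;",
        autocommit=True,
    )
    cursor = conn.cursor()
    cursor.execute("RESTORE DATABASE CDR FROM DISK = '" + backup_path + "'")
    while cursor.nextset():
        pass
    conn.close()

if __name__ == "__main__":
    backup_cdr("ccm-publisher", r"D:\backups\cdr.bak")
    # ...copy cdr.bak from the publisher to the reporting server here...
    restore_cdr("reporting-sql", r"D:\backups\cdr.bak")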

Similar Messages

  • Is it possible to move the master data from one server to another server?

        Is it possible to move the master data from one server to another server?

    iOS: Transferring information from your current iPhone, iPad, or iPod touch to a new device

  • Can I take the hard drive from a late 2008 MacBook Pro and put it in an Early 2011 MacBook Pro?

    I just got a new 15" MBP and I want to give my older one to my son. The "older" one is an Early 2011.  My son has the Late 2008 model. My brother will take the 2008 MBP once I've moved all my son's stuff to the 2011.  Here's the question. My son has a 750GB hard drive on the 2008. He has partitions on it for Boot Camp and for his games. The 2011 only has a 500GB hard drive. Rather than erasing the 2011, restoring it from Time Capsule, then setting up Boot Camp and the other partitions all over again, or even using Winclone or something like that, can I just swap the hard drives? It would certainly faster and easier. I just wonder if there is a technical reason why I shouldn't do that. 

    You could always give it a shot - the only problem that you might run into is that the early 2011 might not boot with the hard drive swapped in from the 2008. Some versions of OS X are simply not compatible between different models.
    But it's worth a try...
    Clinton

  • I would like to import my bookmarks from IE but the 'Import Data from Another Browser' is greyed out and not an available choice.

    Following the Firefox directions, I opened the Library tool - Import & Backup - but the drop-down choice 'Import Data from Another Browser' is greyed out and unavailable. I do not know how to proceed in order to import my favorites from IE.

    Make sure that you do not run Firefox in permanent Private Browsing mode.
    *https://support.mozilla.com/kb/Private+Browsing
    To see all History and Cookie settings, choose:
    *Tools > Options > Privacy, choose the setting "Firefox will: Use custom settings for history"
    *Deselect: [ ] "Always use private browsing mode"

  • Is it possible to get the element data type of a Queue from itself?

    Hi everyone,
    I have a queue whose element data type is a cluster.
    Now when I want to enqueue,
    I use Bundle by Name,
    and for that I need to have my data type present
    (a long wire from wherever, possibly from where I obtained the queue).
    My question is
    whether there is a method/property node/something that allows me to
    wire the queue refnum into it and receive the element data type,
    so I can then feed it into the top of Bundle by Name?
    (I really don't want to have that wire all over the place.)
    The reason I ask here is that
    the help for the outgoing queue refnum from the Obtain Queue method
    shows the element data type, so I hope there might be a solution.
    thx for your time
    and cheers
    j

    If I understand your question correctly, the answer is "Yes, it's very easy ..."
    The answer is "Preview Queue".  Here I create a Queue of some mysterious type (it's a cluster having a Thing and a Center, but you don't know that yet).  I take the Queue reference from whereever I can find it and pass it into Preview Queue Element.  I take the output and use it to define my cluster in Bundle by Name.
    Two caveats.  This copies the first element of the Queue into the cluster, so you probably need to be sure to define all of the elements of your cluster.  But what if the Queue is empty (as mine is, above, as I just Obtained it) -- well, that's why 0 is wired into the TimeOut input, since I do not want to wait "forever" (-1) for the empty Queue to have an element!  Turns out that even in this case, you still get the correct Cluster elements!
    Neat, huh?
    Bob Schor

  • LT01 - Is it possible to modify the posting date?

    I would like to transfer some quantity of a material from one storage bin to another storage bin. Is it possible to set the posting date? From the LT01 transaction it is not possible to set the posting date; the system takes the current day.
    Thanks in advance.

    Hello,
    There is no workaround to change the posting date. If I try to move material between the bins and change the date, it always picks the current system date and time.
    regards,
    rakesh
    Edited by: RAKESH ASHOK MANE on Sep 6, 2010 12:54 PM

  • Is it possible to take an InfoPath form with two repeating sections into one SharePoint list?

    Is it possible to take an InfoPath form with two repeating sections into one SharePoint list?
    That is, take the two repeating sections and put them one below the other in a SP list.
    The motive is that the first repeating section is based on an account owned by the requestor, and the second repeating section is for when the requestor is doing backup time for someone else and needs to log the amount of time spent in the peer account.
    I have basic logic in the form: when the requestor answers "Are you doing backup for someone else?" and presses YES, the second repeating section becomes available.
    Let me know how painful this is going to be (or not).
    Q1: Is it possible to do this, with or without code?
    Q2: What steps do I need to take to accomplish this? Is it feasible or not?
    The following picture gives a better idea of what I am looking to accomplish:
    CRISTINA & MICROSOFT Forum

    Hi,
    Thank you for your question. I am trying to involve someone familiar with this topic to further look at this issue. There might be some time delay. Appreciate your patience. Thank you for your understanding and support.
    Thanks,
    Linda Li
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Linda Li
    TechNet Community Support

  • Is there any VI or way to find all the files in a directory and to list the file paths as a string?

    Is there any VI or way to find all the files in a directory and to list the file paths as a string?
    Like the search function in Windows?
    I have found one VI that lists the files in the directory itself (the path of each file as a string), but not the files in any subdirectory. It also has a function to list the subdirectories, but it does not list the complete set of files in every directory.
    Can somebody help me please?
    THX

    Hello Chef,
    in LV8.21 the described function can be found in the File palette:
    Unfortunately that function does not exist yet in LV8... (That's why it is always a good idea to say which LV version you are working with!)
    You can, however, look at vi.lib/utility/RecursiveDirList.llb to see how it is done (a generic sketch of such a recursive listing also follows below this post).
    Message Edited by GerdW on 08-17-2007 11:17 AM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
    Attachments:
    AdvFileFunc.png 10 KB
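    For readers outside LabVIEW, here is a minimal sketch of what such a recursive directory listing does, written in Python. It is only the general idea, not the LabVIEW implementation, and the start directory is a placeholder.

    # Collect the full path of every file under 'root', including all
    # subdirectories, and return the paths as strings.
    import os

    def recursive_dir_list(root):
        paths = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                paths.append(os.path.join(dirpath, name))
        return paths

    if __name__ == "__main__":
        for p in recursive_dir_list(r"C:\data"):
            print(p)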

  • I have lost my iPhone 5. Is it possible to load the backup data (including notes) from the iPhone backup to my iPad (air)?

    Hello, I had an iPhone 5 that I have now lost. I'd like to access the backup for that phone, particularly the notes data. Is it possible to access the backup data via my iPad Air?
    If not, do I have to wait until I get my new iPhone and then restore from the backup?
    Could someone point me in the right direction ASAP?
    many thanks

    Hello Jim
    just thought I'd let you know that I bought the iexplore software, downloaded it, and solved my issues in less than 5 minutes!
    Great tip, thanks for your help.
    regards
    Charles

  • Is it possible to get the raw data?

    Hi,
    I am trying to write a program that sends the data captured by a webcam to a server, but I need to send it with Java sockets.
    How can I get the raw data from a DataSource so I can send it myself?
    Thanks

  • How to get the complete data from Web Dynpro using an RFC

    hi guys,
    A form is created in Web Dynpro. When that PDF form is called, the user has to provide the key field, i.e. the employee number, and when he presses the 'Go' button an RFC is called that fills in all the details of the employee in the form. The user then has to input some fields and press the Submit button. My problem is here: when the user presses the Submit button, the form is converted to a stream (binary) of data and sent to one of the import parameters of the RFC.
    I have created an RFC with an import parameter of data type XSTRING. Since I want to see what exactly R/3 is receiving from Web Dynpro, I am writing this content to a text file on the server. When I look at that text file I cannot see the complete data.
    When I searched in R/3, the capacity of XSTRING appeared to be 1024 characters, so I don't know how to capture the entire data from Web Dynpro into my R/3 system.
    Below is the code I wrote. Please tell me if I am missing anything, or whether there is any data type that can hold the more than 500 KB of data coming from Web Dynpro.
    FUNCTION ZSEND_MAIL_ATTACHMENT.
    ""Local Interface:
    *"  IMPORTING
    *"     VALUE(OUT_PLACE_LEVEL) TYPE  XSTRING OPTIONAL
    *"     VALUE(BIN_DATA) TYPE  INDX_CLUST OPTIONAL
    *"  TABLES
    *"      IT_MESSAGE STRUCTURE  SOLISTI1
    * Data Declaration
      DATA: gd_cnt TYPE i,
          gd_sent_all(1) TYPE c,
          gd_error TYPE sy-subrc,
          tab_lines LIKE sy-tabix.
    * Structure Declaration
      DATA : BEGIN OF it_file OCCURS 0,
              row(255),
             END OF it_file.
      DATA : BEGIN OF i_split OCCURS 0,
      row(50),
      END OF i_split.
    * Internal Table Declaration
      data : it_receivers like table of SOMLRECI1 with header line."occurs 0.
      DATA : objbin LIKE SOLISTI1 OCCURS 0 WITH HEADER LINE. "sOLIX
      DATA : it_packing_list LIKE sopcklsti1 OCCURS 0 WITH HEADER LINE.
      data : wa_receiver like table of SOMLRECI1 with header line.
      data : it_receiver like table of SOMLRECI1 with header line.
      data : v_bin_data like SOLISTI1 occurs 0 with header line.
      DATA : gd_doc_data LIKE sodocchgi1 OCCURS 0 WITH HEADER LINE.
    *data bin_data1 like table of solix with header line.
      REFRESH : objbin, it_packing_list, it_receivers, wa_receiver.
      CLEAR   : objbin, it_packing_list, wa_receiver, it_receivers.
      DATA V_SUBJECT(255) VALUE 'HI'.
      gd_doc_data-doc_size = ( tab_lines - 1 ) * 255 + STRLEN( it_message ).
      gd_doc_data-obj_langu = sy-langu.
      gd_doc_data-obj_name = 'SENDFILE'.
      gd_doc_data-obj_descr = v_subject.
      gd_doc_data-sensitivty = 'O'.
      APPEND GD_DOC_DATA.
    * Appending The Internal Table it_packing_list
      it_packing_list-head_start = 1.
      it_packing_list-head_num = 0.
      it_packing_list-body_start = 1.
      it_packing_list-doc_type = 'RAW'.
      it_packing_list-body_num = tab_lines.
      APPEND it_packing_list.
    *CALL FUNCTION 'GUI_UPLOAD'
    *EXPORTING
    *filename = V_FILE_PATH
    *filetype = 'BIN'
    *TABLES
    *data_tab = BIN_DATA.
      move bin_data to v_bin_data.
      append v_bin_data.
    *move soli to bin_data.
      LOOP AT V_BIN_DATA into objbin.
    MOVE v_bin_data TO objbin-line.
        APPEND objbin.
      ENDLOOP.
      CLEAR it_packing_list.
      DESCRIBE TABLE objbin LINES tab_lines.
      it_packing_list-transf_bin = 'X'.
      it_packing_list-head_start = 1.
      it_packing_list-head_num = 1.
      it_packing_list-body_start = 1.
      it_packing_list-doc_type = 'PDF'.
      it_packing_list-body_num = tab_lines.
      it_packing_list-doc_size = tab_lines * 255.
      APPEND it_packing_list.
    data file(255) value '/tmp/bali.txt'.
    *Appending The Internal Table it_receivers
    close dataset '/tmp/bali.txt'.
    open dataset '/tmp/bali.txt' for output in text mode encoding default.
    if sy-subrc = 0.
    loop at objbin.
    transfer objbin to '/tmp/bali.txt'.
    endloop.
    else.
    write 'hi'.
    close dataset '/tmp/bali.txt'.
    endif.
      it_receiver-receiver = '[email protected]'.
      it_receiver-rec_type = 'U'.
      it_receiver-com_type = 'INT'.
    APPEND wa_receiver.
    move wa_receiver[] to it_receiver[].
      append it_receiver.
    *Move wa_receiver[] to it_receivers[].
    Clear it_receivers.
    if i_OUT_PLACE_LEVEL NE 0.
    loop at it_receivers into wa_receiver.
    **Function Module To Post The Message To External Mail
         CALL FUNCTION 'SO_NEW_DOCUMENT_ATT_SEND_API1'
           EXPORTING
             document_data              = gd_doc_data
             put_in_outbox              = 'X'
             commit_work                = 'X'
           TABLES
             packing_list               = it_packing_list
             CONTENTS_TXT               = objbin
             receivers                  = it_receivers
           EXCEPTIONS
             too_many_receivers         = 1
             document_not_sent          = 2
             document_type_not_exist    = 3
             operation_no_authorization = 4
             parameter_error            = 5
             x_error                    = 6
             enqueue_error              = 7
             OTHERS                     = 8.
      clear wa_receiver.
    endloop.
    endif.
    ENDFUNCTION.

    You have to convert your long string to a table of shorter strings.
    There may be other ways, but one possibility is to use a loop to process your string:
    while (there is something left)
       put the next e.g. 1024 characters in a new row of your table
    endwhile
    If you need to reconstruct your string from the table, don't use simple concatenation, since it will remove blanks at the end of lines. Believe me (from experience), sooner or later this will happen.
    Instead you need to either write the pieces back into subsections of your long string, or insert from the end of your table and keep shifting the contents right (probably less efficient). A short sketch of the chunking idea follows below.
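    Here is a minimal sketch of that chunking idea in Python. The 255-byte row width is an assumption chosen to mirror SOLISTI1-style rows; adjust as needed. In ABAP the reassembly would write each row back at its offset rather than CONCATENATE-ing, since CONCATENATE drops trailing blanks on character fields; in Python, joining the byte chunks already preserves everything, so this only illustrates the splitting arithmetic.

    ROW_WIDTH = 255

    def split_into_rows(data, width=ROW_WIDTH):
        # Cut the payload into rows of at most 'width' bytes.
        return [data[i:i + width] for i in range(0, len(data), width)]

    def rebuild(rows):
        # Reassemble by position so nothing (e.g. trailing blanks) is dropped.
        return b"".join(rows)

    if __name__ == "__main__":
        payload = b"A" * 1000 + b"   "   # note the trailing blanks
        rows = split_into_rows(payload)
        assert rebuild(rows) == payload  # round-trips exactly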

  • Multiple reads of the same data from a MultiProvider by a query (BEx)

    Hello, guys!
    We're having an issue with the performance of a query built on a MultiProvider. During our investigation, we found out that within one run of a query, it refers to the InfoProvider several times for the same data (see image attached).
    Do you have any ideas what could be the reason for multiple reads of the same data from the MultiProvider?

    Hello Nikita,
    By "copy of a query" i meant something like this as shown below :
    *Kindly click on the screenshot for a better view.
    1) See the highlighted portions below in the screenshot . See Query 2 highlighted and name of the BEx query highlighted.
    2) See the highlighted portions . See Query 3 highlighted and name of the BEx query highlighted.
    As you can see from the above screenshots i have used the same BEx query 2 times by the name of Query 2 & Query 3 . Infact i have not attached the complete screenhsot . In that i have used it 6 times.
    I have to analyze this a bit in detail but what i am guessing is that when this WEBi is called the single BEx is also called multiple times. And hence it hits the Info Provider multiple times resulting in a decreased performance.
    But this does not mean that this is the wrong approach. There are various areas where you can improve, for example:
    1) Either improve your BEx query if possible, or use aggregates or something like that.
    2) Use the query stripping setting in WebI so that unused dimensions and measures are not pulled, resulting in improved performance. It's switched on by default.
    Thanks!!
    Regards,
    Ashutosh Singh

  • How long does it take to transfer data from a PC to a Mac?

    Using Migration Assistant, I am transferring data from a PC to an iMac. I have no idea how long this process should take. How long does it take to transfer data from a PC to a Mac?

    This may not be the best forum to get an answer.
    It may depend on what is being (or can be) migrated and on how much data is already on the PC.
    AC

  • Is it possible using SQLite to collect data from an older SQL database?

    Is it possible using SQLite to collect data from an older SQL database? Where can I find a possible answer? Thanks in advance.

    There are 3rd-party tools (see comprehensive list at http://www.kenhamady.com/bookmarks.html) that provide extra pdf functionality on top of the pdf export from Crystal. 
    In the case of my Visual CUT software, you can use hidden formulas inside your Crystal report to generate form fields (pre-populated as well as empty) as part of the pdf export process.
    hth,
    ido
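    For the SQLite side of the question above, here is a minimal hedged sketch in Python. It assumes the older database is reachable via ODBC and that a simple table is being copied; the connection string, table, and column names are all placeholders.

    # Pull rows from an older SQL database (via ODBC) into a local SQLite file.
    import sqlite3
    import pyodbc

    def copy_table(odbc_conn_str, sqlite_path):
        src = pyodbc.connect(odbc_conn_str)
        dst = sqlite3.connect(sqlite_path)
        dst.execute("CREATE TABLE IF NOT EXISTS calls (call_id INTEGER, placed_at TEXT)")
        rows = src.cursor().execute("SELECT call_id, placed_at FROM calls").fetchall()
        dst.executemany("INSERT INTO calls VALUES (?, ?)", [tuple(r) for r in rows])
        dst.commit()
        src.close()
        dst.close()

    if __name__ == "__main__":
        copy_table("DSN=legacy_db;UID=report;PWD=secret", "collected.sqlite")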

  • Is there any way to get NetBoot to take the NetBoot set from another drive?

    Is there any way to get NetBoot to take the NetBoot set from another drive? Also, is there a limit to the amount of data in a NetBoot set?

    I've managed to put the NetBoot set on another drive; however, the iMacs won't recognise it on boot from the network.
