Loading data via Collaborative Planning
Hi
Is it possible to load data into the MRP Forecast tables using Collaborative Planning? If so, it would be very useful to me.
Thanks in advance,
Ricardo Cabral
Thank you Eddy, but it seems it does not work.
The error message received from DTW is:
<BOM>
  <BOM>
    <BO>
      <AdmInfo>
        <Object>66</Object>
        <Version>2</Version>
      </AdmInfo>
      <ProductTrees>
        <row>
          <TreeCode>A00001</TreeCode>
          <Quantity>1</Quantity>
          <TreeType>iProductionTree</TreeType>
        </row>
      </ProductTrees>
      <ProductTrees_Lines>
        <row>
          <Currency>USD</Currency>
          <IssueMethod>im_Backflush</IssueMethod>
          <ItemCode>A00006</ItemCode>
          <Price>160</Price>
          <PriceList>1</PriceList>
          <Quantity>3</Quantity>
          <Warehouse>01</Warehouse>
        </row>
        <row>
          <Currency>USD</Currency>
          <IssueMethod>im_Backflush</IssueMethod>
          <ItemCode>S10002</ItemCode>
          <Price>160</Price>
          <PriceList>1</PriceList>
          <Quantity>3</Quantity>
          <Warehouse>01</Warehouse>
        </row>
      </ProductTrees_Lines>
    </BO>
  </BOM>
  <ErrorMessage>A00001 - Application-defined or object-defined error</ErrorMessage>
</BOM>
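One cheap first check before chasing the DI API error: make sure the template XML itself is well-formed. A minimal sketch using Python's standard library; the inline string only mirrors the header fragment above, and in practice you would read your exported template file instead:

```python
# Well-formedness check for a DTW/DI API template fragment.
# The inline string mirrors the ProductTrees header above; in practice,
# parse the actual template file instead.
import xml.etree.ElementTree as ET

template = """\
<BOM>
  <BO>
    <AdmInfo><Object>66</Object><Version>2</Version></AdmInfo>
    <ProductTrees>
      <row>
        <TreeCode>A00001</TreeCode>
        <Quantity>1</Quantity>
        <TreeType>iProductionTree</TreeType>
      </row>
    </ProductTrees>
  </BO>
</BOM>
"""

root = ET.fromstring(template)            # raises ParseError if the XML is malformed
tree_code = root.find(".//TreeCode").text
print(tree_code)
```

If the XML parses cleanly, an "application-defined or object-defined error" usually points at the data itself (for example an item code or warehouse that does not exist in the target company) rather than the file structure.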
Similar Messages
-
Unable to load data to Hyperion planning application using odi
Hi All,
When I try to load data into Planning using ODI, the ODI process completes successfully, with the status shown below in the Operator's ReportStatistics, but the data doesn't appear in the Planning data form or in Essbase.
Can anyone please help?
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 2, in <module>
Planning Writer Load Summary:
Number of rows successfully processed: 20
Number of rows rejected: 0
source is oracle database
target account dimension
LKM SQL TO SQL
IKM SQL TO HYPERION PLANNING is used
In Target the following columns were mapped
Account(load dimension)
Data load cube name
driverdimensionmetadata
Point of view
LOG FILE
2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Oracle Data Integrator Adapter for Hyperion Planning
2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Connecting to planning application [OPAPP] on [mcg-b055]:[11333] using username [admin].
2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: Successfully connected to the planning application.
2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: The load options for the planning load are
Dimension Name: Account
Sort Parent Child : false
Load Order By Input : false
Refresh Database : false
2012-08-27 09:46:43,339 INFO [SimpleAsyncTaskExecutor-3]: Begining the load process.
2012-08-27 09:46:43,355 DEBUG [SimpleAsyncTaskExecutor-3]: Number of columns in the source result set does not match the number of planning target columns.
2012-08-27 09:46:43,371 INFO [SimpleAsyncTaskExecutor-3]: Load type is [Load dimension member].
2012-08-27 09:46:43,996 INFO [SimpleAsyncTaskExecutor-3]: Load process completed.
Do any members exist in the account dimension before the load? If not, can you try adding one member manually and then trying the load again?
Cheers
John
http://john-goodwin.blogspot.com/ -
Chart dataTipFunction and loading data via HTTPService
Hi everyone,
I have a problem and I hope someone is able to give me a hint.
I am using a chart with data points. At the moment I am using a dataTipFunction to give each point a "tooltip".
Now I need the ability to load some data via HTTPService and display it as the tooltip, instead of the original value, while hovering over a point.
This is needed because there is a lot of data and its values change continuously in small ways. I don't want to reload every possible data point; only the data tips actually being examined should be updated.
mouse over data point --> tooltip: "please wait, while updating" --> HTTPService finished --> tooltip: "new data xyz"
The data tip function only returns a string, which is displayed as the tooltip. If the data tip function triggers an HTTPService call, how can I update that tooltip text afterwards?
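The pattern described above (return a placeholder immediately, fetch in the background, replace the tip text once the response arrives) can be sketched outside Flex. Here is a minimal Python illustration, with a plain callback standing in for the HTTPService result handler; all names are made up:

```python
# Placeholder-then-update tooltip pattern. A callback stands in for the
# HTTPService result handler; all names here are illustrative.
tooltip_cache = {}          # point id -> last known tip text

def data_tip(point_id, fetch):
    """Return a tip immediately; kick off an async update on a cache miss."""
    if point_id in tooltip_cache:
        return tooltip_cache[point_id]
    # Start the "service call"; its callback fills the cache, after which
    # the chart should be asked to refresh the currently open tip.
    fetch(point_id, lambda text: tooltip_cache.__setitem__(point_id, text))
    return "please wait, while updating"

# A synchronous stand-in for the remote service:
def fake_service(point_id, on_result):
    on_result("new data xyz")

first = data_tip("p1", fake_service)    # placeholder on the first hover
second = data_tip("p1", fake_service)   # cached value on the next hover
print(first, "|", second)
```

The part the sketch cannot solve for you is the refresh: since the tip function can only return a string, the result handler has to force the chart to re-show the tip once the cache is updated.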
Any ideas? -
Problem loading data to Hyperion Planning app. from Oracle EBS using ERPi
I'm using ERPi to load data to Planning application from EBS.
I have configured Source and Target, created import formats, location, data load rules etc.
When I run the data load rule, it shows that data is imported and exported successfully, and I can see the data in the right format in the Data Load Workbench. ODI shows the process completed successfully. But still no data is loaded into Planning.
Even AIF_HS_BALANCES staging view in ERPi repository shows correct data.
Can some one tell me what could be the problem.
Or where I can see logs for the data load step for Planning.
ODI operator logs are not very clear to me, and the same goes for the log tables in the ODI work repository.
Hi Tony
Apologies for the delay (my email client spam-filtered the notification that you had replied, helpful!). When you say that you have done it, but discourage accessing the tables directly, how did you achieve it? Were you just careful about which data intersections you selected from the tables (e.g. always base-level data, so that it is stored rather than calculated)?
I have used the HFM relational tables a couple of times to make use of the metadata that they contain but not tried to get data before.
Regards
Stuart -
Error while Loading data Via Integrator
Hi,
I'm currently following the GettingStarted screencast series 3.0 in youtube.
In 'loadData.grf', while I'm trying to load data into the data domain after configuring the 'Reformat' and 'ExHashJoin' components, I get the error below; can anyone kindly help me?
INFO [main] - *** CloverETL framework/transformation graph, (c) 2002-2013 Javlin a.s, released under GNU Lesser General Public License ***
INFO [main] - Running with CloverETL library version 3.3.0 build#036 compiled 12/02/2013 18:45:45
INFO [main] - Running on 4 CPU(s), OS Windows 7, architecture amd64, Java version 1.7.0_07, max available memory for JVM 668160 KB
INFO [main] - Loading default properties from: defaultProperties
INFO [main] - Graph definition file: graph/Copy of LoadData1.grf
INFO [main] - Graph revision: 1.17 Modified by: MANJU Modified: Tue Jul 23 08:08:54 IST 2013
INFO [main] - Checking graph configuration...
ERROR [main] - Graph configuration is invalid.
WARN [main] - [Join Dim on Facts:JOIN_DIM_ON_FACTS] - Input port 4 not defined or mapping has too many elements (number of input ports: 4)
ERROR [main] - [ResponseSurvey:RESPONSE_SURVEY] - At least 1 output port must be defined!
ERROR [main] - Error during graph initialization !
Element [1371098357416:LoadData]-Graph configuration is invalid.
at org.jetel.graph.runtime.EngineInitializer.initGraph(EngineInitializer.java:263)
at org.jetel.graph.runtime.EngineInitializer.initGraph(EngineInitializer.java:239)
at org.jetel.main.runGraph.runGraph(runGraph.java:377)
at org.jetel.main.runGraph.main(runGraph.java:341)
Caused by: org.jetel.exception.ConfigurationException: [ResponseSurvey:RESPONSE_SURVEY] - At least 1 output port must be defined!
at org.jetel.exception.ConfigurationProblem.toException(ConfigurationProblem.java:156)
at org.jetel.exception.ConfigurationStatus.toException(ConfigurationStatus.java:106)
Hi,
I've managed to recreate the graph, but I'm stuck with another error in the "Getting started" application.
This time I'm adding "Geography Dim" and "Reseller Dim" to "Join Dims to Fact" through an ExHash join.
I got the error below:
INFO [main] - *** CloverETL framework/transformation graph, (c) 2002-2013 Javlin a.s, released under GNU Lesser General Public License ***
INFO [main] - Running with CloverETL library version 3.3.0 build#036 compiled 12/02/2013 18:45:45
INFO [main] - Running on 4 CPU(s), OS Windows 7, architecture amd64, Java version 1.7.0_07, max available memory for JVM 668160 KB
INFO [main] - Loading default properties from: defaultProperties
INFO [main] - Graph definition file: graph/LoadData.grf
INFO [main] - Graph revision: 1.26 Modified by: MANJU Modified: Thu Jul 25 23:54:02 IST 2013
INFO [main] - Checking graph configuration...
INFO [main] - Graph configuration is valid.
INFO [main] - Graph initialization (LoadData)
INFO [main] - [Clover] Initializing phase: 0
INFO [main] - [Clover] phase: 0 initialized successfully.
INFO [WatchDog] - Starting up all nodes in phase [0]
INFO [WatchDog] - Successfully started all nodes in phase!
ERROR [WatchDog] - Graph execution finished with error
ERROR [WatchDog] - Node GEOGRAPHY_DIM finished with status: ERROR caused by: Component pre-execute initialization failed.
ERROR [WatchDog] - Node GEOGRAPHY_DIM error details:
Element [GEOGRAPHY_DIM:Geography Dim]-Component pre-execute initialization failed.
at org.jetel.graph.Node.run(Node.java:458)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 10
at org.jetel.data.parser.AhoCorasick.isPattern(AhoCorasick.java:182)
at org.jetel.data.parser.CharByteDataParser$ByteRecordSkipper.skipInput(CharByteDataParser.java:1594)
at org.jetel.data.parser.CharByteDataParser.skip(CharByteDataParser.java:264)
at org.jetel.util.MultiFileReader.skip(MultiFileReader.java:341)
at org.jetel.util.MultiFileReader.nextSource(MultiFileReader.java:286)
at org.jetel.util.MultiFileReader.preExecute(MultiFileReader.java:498)
at org.jetel.component.DataReader.preExecute(DataReader.java:237)
at org.jetel.graph.Node.run(Node.java:456)
... 1 more
INFO [WatchDog] - [Clover] Post-execute phase finalization: 0
INFO [WatchDog] - [Clover] phase: 0 post-execute finalization successfully.
INFO [WatchDog] - Execution of phase [0] finished with error - elapsed time(sec): 0
ERROR [WatchDog] - !!! Phase finished with error - stopping graph run !!!
INFO [WatchDog] - -----------------------** Summary of Phases execution **---------------------
INFO [WatchDog] - Phase# Finished Status RunTime(sec) MemoryAllocation(KB)
INFO [WatchDog] - 0 ERROR 0 27979
INFO [WatchDog] - ------------------------------** End of Summary **---------------------------
INFO [WatchDog] - WatchDog thread finished - total execution time: 5 (sec)
INFO [main] - Freeing graph resources.
ERROR [main] - Execution of graph failed !
kindly help. -
Column Mapping while loading data via tab delimited text file
While attempting to load data I get an error message at the "Define Column Mapping" step. It reads as follows:
1 error has occurred
There are NOT NULL columns in SYSTEM.BASICINFO. Select to upload the data without an error
The drop-down box has the names of the columns, but I don't know how to map them to the existing field names.
By the way, where can I learn what goes in "format" at this same juncture?
Thanks!
You can use Insert Into Array and wire the column index input instead of the row index, as shown in the following pic:
Just be sure that the number of elements in Array2 is the same as the number of rows in Array1.
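tbob's reply is about LabVIEW's Insert Into Array, but the constraint generalizes to any language: the column being inserted must have exactly as many elements as the array has rows. A small illustrative sketch in Python (the function name is made up):

```python
def insert_column(array2d, index, column):
    """Insert `column` at position `index` in each row of `array2d`.

    Mirrors the LabVIEW constraint: the column must have exactly as many
    elements as the array has rows, or the insert is rejected.
    """
    if len(column) != len(array2d):
        raise ValueError("column length must equal the number of rows")
    return [row[:index] + [value] + row[index:]
            for row, value in zip(array2d, column)]

table = [[1, 2],
         [3, 4]]
print(insert_column(table, 1, [9, 9]))   # a new middle column of 9s
```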
Message Edited by tbob on 03-07-2006 11:32 AM
- tbob
Inventor of the WORM Global
Attachments:
Add Column.png 2 KB -
HFM - log that shows if a user has loaded data via web form or excel load.
I can see any data loads that are coming from FDM, but is there a log that shows any data entered into HFM via web forms or submitted through an excel file? Any input is appreciated.
Thanks
You could enable Data Audit to capture data changes made by users, though this will not capture which method users chose to change the data. That is, HFM can show that data changed, and who changed it, but it cannot tell whether the data was changed through a form, grid, Smart View, or FDM. If you want to prevent users from changing data through forms, grids, or Smart View, you can secure those input methods, but you cannot capture which one is used.
--Chris -
Hi,
From my reading of the XML DB papers, it seems that it is possible to load XML data files via FTP.
Could someone please give me an example of how this would be done, as I cannot find a concrete example in the documentation?
Thanks for your attention.
Pete
C:\>ftp
ftp> open localhost 2100
Connected to mdrake-lap.
220 mdrake-lap FTP Server (Oracle XML DB/Oracle9i Enterprise Edition Release 9.2
.0.1.0 - Production) ready.
User (mdrake-lap:(none)): scott
331 pass required for SCOTT
Password:
230 SCOTT logged in
ftp> cd public
250 CWD Command successful
ftp> mkdir test
257 MKD command successful
ftp> cd test
250 CWD Command successful
ftp> pwd
257 "/public/test" is current directory.
ftp> put c:\temp.txt temp.txt
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 305 bytes sent in 0.00Seconds 305000.00Kbytes/sec.
ftp> get temp.txt -
200 PORT Command successful
150 ASCII Data Connection
Hello
This is a simple text file.....
It is stored in the Resource View, If it were a schema based XML file, and the
Schema had been registered with XML DB, then the Resource View would contain a
reference to a row stored in the default table identified by the XML Schema.
Does this help...
226 ASCII Transfer Complete
ftp: 305 bytes received in 0.03Seconds 10.17Kbytes/sec.
ftp> quit
221 QUIT Goodbye.
C:\>sqlplus scott/tiger
SQL*Plus: Release 9.2.0.1.0 - Production on Mon Jun 24 12:54:27 2002
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
SQL> set long 10000
SQL> select xdburitype('/public/test/temp.txt').getCLob() from dual;
XDBURITYPE('/PUBLIC/TEST/TEMP.TXT').GETCLOB()
Hello
This is a simple text file.....
It is stored in the Resource View, If it were a schema based XML file, and the
Schema had been registered with XML DB, then the Resource View would contain a
reference to a row stored in the default table identified by the XML Schema.
Does this help...
SQL> exit
Disconnected from Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
C:\> -
Edit in Excel option did not load data to financial plan
Hi
The financial plan (budgets) is being uploaded into Oracle Projects from the Projects HTML responsibility using the Edit in Excel option. When the data is uploaded, the status in the Excel sheet shows as successful, but when the same is checked on the front end, it has not been imported. This happens at random.
### Steps to Reproduce ###
Open the Projects HTML responsibility and query for the project.
Go to the Financial tab.
Select a working version of the budget and select Edit in Excel.
In the Excel sheet that opens, enter the required data and select Oracle > Upload from the menu.
The status in the sheet shows as successful, but on checking the working version, the data has not been imported.
thanks
Abinesh
Hi,
This seems to be a server issue. Can you bounce Apache and try again?
Thanks
Sathish Raju -
Unable to load data into Planning cube
Hi,
I am trying to load data into a planning cube using a DTP, but the system throws a message saying that real-time data loading cannot be performed on the planning cube.
What could be the reason for this?
Thanks & Regards,
Surjit P
Hi Surjit,
To load data into the cube using a DTP, the cube must be set to loading mode.
The real-time cube behaviour is set to planning mode only when data is entered into the cube via the planning layouts or through the file upload program.
You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-time Load Behaviour, and select the first option (real-time cube can be loaded with data; planning not allowed).
Best Rgds
Shyam
Edited by: Syam K on Mar 14, 2008 7:57 AM
Edited by: Syam K on Mar 14, 2008 7:59 AM -
Loading date/planned GI date fall in the past dates
Hi All,
We have an outbound delivery created with reference to a sales order. The delivery was created on 6/21, but we found past dates in this delivery, as below:
- Transportation Planning 05/19
- Loading Date 05/19
- Planned GI 05/19
I went back to check the schedule line of one of the items in the sales order:
I got:
Delivery date 07/10
Goods issue date 07/10
Loading date 07/10
Material avail.date 06/21
Transportation Plan. Date 07/10
Why do these dates not transfer from the sales order to the outbound delivery? Please help and advise.
Thanks,
Check the copy control settings from order to delivery using tcode VTLA.
-
Planning 9.3.1 - Loading date values through HAL
Hi,
I am using HAL (VBIS) to load data into Hyperion Planning 9.3.1.
I am able to load all numeric and text accounts, but I have an issue loading values into date fields. I tried different date formats as well as different data types in HAL, with no luck.
Has anyone tried it? Any suggestions?
Appreciate your thoughts.
Thanks,
Ethan.
Hi Ethan,
It is not HAL, but you might still find this useful: Re: Loading date type measure data from text file
Cheers,
Mehmet -
Short dump:ASSIGN_TYPE_CONFLICT- While loading data through DTP
Dear all:
We currently work with BI NW2004s SP10. We created a transformation for mapping an InfoSource to an InfoCube based on a 3.x transfer rule. For example, we used cube 0PUR_C04 and DataSource 2LIS_02_ITM_CP, and the transformation is "TRCS Z2LIS_02_ITM_CP -> CUBE 0PUR_C04". Every time we tried to load data via the DTP, a runtime short dump occurred: ASSIGN_TYPE_CONFLICT
Error analysis:
You attempted to assign a field to a typed field symbol, but the field does not have the required type.
We went back and forth reactivating the transformation and the DTP, but still the same error occurred.
Any idea, please !!!!
BR
SzuFen
Hi Pavel:
Please refer to the following information-
User and Transaction
Client.............. 888
User................ "TW_S
Language key........ "E"
Transaction......... " "
Program............. "GPD0
Screen.............. "SAPM
Screen line......... 6
===========================================================
Information on where terminated
Termination occurred in the ABAP program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" - in
"EXECUTE".
The main program was "RSBATCH_EXECUTE_PROZESS ".
In the source code you have the termination point in line 704
of the (Include) program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
The program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" was started as a background job.
Job Name....... "BIDTPR_284_1"
Job Initiator.. "TW_SZU"
Job Number..... 16454800
===========================================================
Short text
Type conflict with ASSIGN in program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
===========================================================
Error analysis
You attempted to assign a field to a typed field symbol,
but the field does not have the required type.
===========================================================
Information on where terminated
Termination occurred in the ABAP program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" - in
"EXECUTE".
The main program was "RSBATCH_EXECUTE_PROZESS ".
In the source code you have the termination point in line 704
of the (Include) program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
The program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" was started as a background job.
Job Name....... "BIDTPR_284_1"
Job Initiator.. "TW_SZU"
Job Number..... 16454800
===========================================================
Line SourceCde
674 ELSE.
675 ASSIGN rdsTG_1->* to <_ys_TG_1>.
676 CLEAR <_ys_TG_1>.
677 MOVE-CORRESPONDING G1 TO <_ys_TG_1>.
678 <_ys_TG_1>-requid = l_requid.
679 l_recno_TG_1 = l_recno_TG_1 + 1.
680 ls_cross-insegid = 1.
681 ls_cross-inrecord = l_recno_SC_1.
682 ls_cross-outsegid = 1.
683 ls_cross-outrecord = l_recno_TG_1.
684
685 CALL METHOD i_r_log->add_cross_tab
686 EXPORTING
687 I_S_CROSSTAB = ls_cross.
688
689 ** Record# in target = sy-tabix - if sorting of table won't be changed
690 <_ys_TG_1>-record = l_recno_TG_1.
691 INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1>.
692 IF sy-subrc <> 0.
693 CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack.
694 ENDIF.
695
696 ENDIF. "Read table
697 *
698 ENDIF.
699 CLEAR skipseg_all.
700 ENDLOOP.
701 * - insert table into outbound segment -
702
703 <_yt_TG_1>[] = <_yth_TG_1>[].
>>>>>
705 rTG_1->insert_table( rdtTG_1_dp ).
706 ENDMETHOD. "execute
707
708
709
710 endclass. "lcl_transform IMPLEMENTATION
711
712 &----
713 *& Form get_runtime_ref
714 &----
715 * text
716 ----
717 * -->C_R_EXE text
718 ----
719 form get_runtime_ref
720 changing c_r_exe type ref to object.
721
722 data: l_r_exe type ref to lcl_transform.
723 create object l_r_exe.
===========================================================
Contents of system fields
Name Val.
SY-SUBRC 0
SY-INDEX 3
SY-TABIX 0
SY-DBCNT 1
SY-FDPOS 0
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY
SY-UCOMM
SY-TITLE Execute Batch Process
SY-MSGTY E
SY-MSGID R7
SY-MSGNO 057
SY-MSGV1 0TOTDELTIME
SY-MSGV2 A
SY-MSGV3
SY-MSGV4
SY-MODNO 0
SY-DATUM 20070420
SY-UZEIT 164557
SY-XPROG SAPCNVE
SY-XFORM CONVERSION_EXIT
===========================================================
Active Calls/Events
No. Ty. Program Include
Name
6 METHOD GPD0QBVJ2WFQZZXBD0IJ1DSZAEL GPD0QBVJ2WFQZZXBD0IJ1DSZAEL
LCL_TRANSFORM=>EXECUTE
5 METHOD CL_RSTRAN_TRFN_CMD============CP CL_RSTRAN_TRFN_CMD============CM005
CL_RSTRAN_TRFN_CMD=>IF_RSBK_CMD_T~TRANSFORM
4 METHOD CL_RSBK_PROCESS===============CP CL_RSBK_PROCESS===============CM00Q
CL_RSBK_PROCESS=>PROCESS_REQUEST
3 METHOD CL_RSBK_PROCESS===============CP CL_RSBK_PROCESS===============CM002
CL_RSBK_PROCESS=>IF_RSBATCH_EXECUTE~EXECUTE
2 FUNCTION SAPLRSBATCH LRSBATCHU13
RSBATCH_EXECUTE_PROCESS
1 EVENT RSBATCH_EXECUTE_PROZESS RSBATCH_EXECUTE_PROZESS
START-OF-SELECTION
===========================================================
Thank you and BR
SF -
Problem while loading data from ODS to infoobject
Hi guys,
I am facing a problem while loading data from an ODS to an InfoObject.
If I load the data via the PSA it works fine, but if I load the data without the PSA it gives a "duplicate records" error.
Do you have any idea why that is?
Thanks in advance
Savio
Hi,
When you load the data via the PSA, what did you select? Serial or parallel?
If you selected serial, most likely you don't have duplicates within the same data package, and your load can go through.
Loading directly into the InfoObject: the "duplicate records" error will therefore be raised if you have the same key in two different packages. You can perhaps flag your InfoPackage with the option "ignore duplicate records", as suggested...
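Olivier's point (keys can be unique within each data package yet still collide across packages) can be illustrated with a small sketch; the package contents are made up:

```python
# Two data packages, each internally free of duplicate keys,
# but sharing key 1002 across packages (contents are made up).
package_1 = [1001, 1002]
package_2 = [1002, 1003]

seen, duplicates = set(), []
for package in (package_1, package_2):
    assert len(package) == len(set(package))   # no duplicates inside a package
    for key in package:
        if key in seen:
            duplicates.append(key)             # only surfaces across packages
        seen.add(key)

print(duplicates)
```

A serial, per-package check would pass for both packages individually, which is exactly why the "duplicate records" error only appears when loading directly into the InfoObject.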
hope this helps...
Olivier. -
Trying to load data but only zeros are being loaded
Hello people, I'm facing the following situation:
I'm trying to load data into a Planning application using a rule file, and my data source type is SQL. The problem is that all the data is being replaced with zeros, so I only get zeros in the database.
I'm using version 11.1.2.1
All level-zero members are stored, while all other members are dynamic calc (I'm trying to load only level-zero members).
Could you please help me?
Thanks in advance...
Hello AceBase, thank you for your answer. It's very kind of you.
I'm overwriting data values. The data in the view is OK.
What is really strange to me is that when I preview the data in the data load rule (in EAS), the data is OK, but then I load the data, and when I look at it in Smart View, all values have turned to zero.
I will try creating the rule file again.
Thank you very much.