Enhancing package 'load transactional data for BW infoprovider UI'
Dear,
For project reasons we would like to enhance the package 'Load transactional data for BW InfoProvider UI'. To be more precise, we want to add some dimensions to the data transfer method 'Replace & clear data values'. At the moment the system clears the data that matches each entity/category/time/datasource combination. With only those dimensions we cannot be precise enough, so in our case the system clears too much data. We would like to be able to extend this selection with additional dimensions.
Is there any way this can be adapted?
thx.
Hi Wouter,
for a more precise delete you should first run a Clear, and afterwards import using the Merge option.
Kind regards
Roberto
Similar Messages
-
Error while loading transaction data from an Infoprovider Mapping C_ACCT
Hi Gurus,
We are implementing BPC Consolidation for NetWeaver 7.5, and we're facing the following situation:
We have loaded master data from 0RC_ACCOUNT (group account) to our dimension C_ACCT without a problem, but when we run the loading package from InfoProvider 0FIGL_C10 (transactional data), the following error message appears:
Dimension: Member C_ACCT: 000 not valid
We have already checked the data in the source InfoProvider and there is no record with account value 000.
Any suggestions?
Thanks in advance
Best Regards
Abraham Méndez
Rad,
See if SAP Notes 492647 and 849501 are of some help in your scenario. -
Loading transaction data from BW Infoprovider
Hello, I have a quick question. When you load data using this Data Manager Package, I see how many it accepted. But is there a way to check exactly what records were loaded?
Thanks.
Hi,
When you run Validate & Process Transformation File you can see the accepted records in the log.
The other way is to check in the back end (RSA1) before loading data into the BPC model:
Step 1 --> Change the load-time behaviour from 'Data loading not allowed' to 'Planning not allowed', so that any request in yellow is turned green. Then change it back to 'Data loading not allowed'.
Step 2 --> Load your transaction data using the package. Once the data load is complete, go back to the back end and switch between the modes again to convert the latest request from yellow to green, as explained in step 1.
Now go to the InfoCube contents and filter the data on the latest request number to see the records loaded by the package.
Note: Please make sure no user is sending data during the data load.
I hope it helps...
regards,
Raju -
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load transactional data using the delivered "Transactional data from NW BW InfoProvider" package and I get the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
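For readers unfamiliar with the mapping syntax above, the effect of the *IF and *NEWCOL rules can be sketched in Python. This is a hedged illustration only: the field names mirror the transformation file, but everything else is an assumption, and this is not how SAP actually executes the file.

```python
# Sketch of what the *IF / *NEWCOL mappings in the transformation file do.
# *NEWCOL(X) fills a constant X for a target column absent in the source;
# the *IF rules substitute a default member when the source value is blank.

def map_record(bw_record):
    """Map one BW source record (dict) to a few of the BPC target fields."""
    func_area = bw_record.get("0FUNC_AREA", "")
    currency = bw_record.get("0CURRENCY", "")
    return {
        # FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
        "FUNCTIONALAREA": "NOFA" if func_area == "" else func_area,
        # INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR);0CURRENCY)
        "INPUTCURRENCY": "NOCURR" if currency == "" else currency,
        # DATASRC = *NEWCOL(R3), VERSION = *NEWCOL(REV0)
        "DATASRC": "R3",
        "VERSION": "REV0",
    }
```

A blank 0FUNC_AREA therefore maps to the NOFA member rather than to an empty (invalid) dimension member, which is what the *IF rules are guarding against.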
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Load Transaction Data From InfoProvider
Hi expert,
I'm trying to upload transaction data from an InfoProvider (using BPC 7.5 NW).
From the Data Manager package I can only select the name of the InfoProvider, but I cannot filter the data (I need to upload only a subset of it).
The strange thing is that when validating the corresponding transformation file, I can see the "Set Selection" button where those selections can be made. Is it a bug that I can't see it in the Data Manager package?
Thanks & regards
Fabiola
Hi Fabiola,
there are 2 options for you:
1. Use the SELECTION option in the *OPTIONS section of the transformation file (e.g. SELECTION=0CALMONTH,200901).
2. Maintain a conversion file and write conversion rules that skip records
(e.g. external column: 2008, internal column: *skip).
The 'Set Selection' button you mention is useful while validating the transformation file against the InfoProvider: you can use it to restrict the number of records to validate.
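As a sketch, the SELECTION option would sit in the *OPTIONS section of the transformation file like this (the InfoObject 0CALMONTH and value 200901 come from the example above; the other options are illustrative defaults, not a definitive file):

```
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
SELECTION = 0CALMONTH,200901
```

With this in place, the Load InfoProvider package reads only records whose 0CALMONTH equals 200901, instead of the whole cube.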
regards, -
Reg: Loading historic data for the enhanced field
Hello All,
We need to add a new field 0VENDOR to our datasource 0FI_GL_4. This field is available in our BSEG table. Hence, we are planning to go ahead with datasource enhancement.
Now, please advise on how to update the historical data for this newly added field. I have heard there is BW functionality/a program to do so without deleting the entire data set. Kindly advise on the possible solutions.
Thanks & Regards
Sneha Santhanakrishnan
Hi Sneha,
Using the remodeling option you will be able to do that, i.e. load historical data for new attributes without deleting existing data. But the limitation is that in remodeling you can only assign a constant, some other attribute's value, or values determined using an EXIT.
When you load data from the source system and need the historical data as well, the best practice is to delete the existing data and reload it from the source system.
But if you don't want to do that, there is one trick, though I am not sure whether it will work: populate the historical values for 0VENDOR using the customer-exit option of remodeling. To get the historical values in the customer exit you will need all that data in some BW table. Here you could create a generic extractor that stores all the documents and their respective vendors, so that as you load data from the source system you get the historical values as well.
Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values.
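The one-time backfill described above could be sketched as follows. In practice this would be ABAP inside the remodeling customer exit; all table and field names here are hypothetical, chosen only to illustrate the lookup:

```python
# Hypothetical sketch of the remodeling backfill: for each existing record,
# look up 0VENDOR from a reference table previously filled by a generic
# extractor, defaulting to '#' (unassigned) when the document is unknown.

doc_to_vendor = {          # stand-in for the generic extractor's table
    "4711": "VEND_A",
    "4712": "VEND_B",
}

def backfill_vendor(record):
    """Return the record with 0VENDOR populated from the lookup table."""
    return {**record, "0VENDOR": doc_to_vendor.get(record["DOC_NO"], "#")}

filled = [backfill_vendor(r) for r in [{"DOC_NO": "4711"}, {"DOC_NO": "9999"}]]
```

The defaulting to '#' matters: a record whose document is missing from the reference table would otherwise end up with no vendor value at all.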
Regards,
Durgesh. -
Can I load transactional data from a generic DataSource without having loaded master data before?
Hello,
I want to load transactional data from a generic data source.
I just want to know whether it would be successful, given that the 0MATERIAL master data I loaded into the PSA could not be loaded into the InfoObject (000000XXXXXXX -> material number error) by the DTP (see thread BI - BI general - user arnaud).
If I can load the transactional data without having loaded the master data into the InfoObject, is it possible that it also loads the master data simultaneously?
Thank you very much for helping
AR.
Loading transactions before master data can result in data elements not being populated in the InfoProvider in some cases, hence the "Best Practice" recommendation to load master data first.
For example, if you have update/transformation rules that read master data to get additional data to enhance the transactional data, or you map master data attributes during the transaction load, those attributes could be missing values if you don't load master data first. -
"Error occurs when loading transaction data from other model" - BW loading into BPC
Hi Experts,
I'm having a problem with my data loading from BW, using the standard Load InfoProvider Selections data manager package.
If I run it for a period without data it succeeds (with a warning), but if there is data to be extracted I get the following error:
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other model
model: AIACONS. Package status: ERROR
As it runs OK when there isn't data, it appears something is preventing the movement of data out of the cube itself, rather than a validation issue.
Has anyone encountered similar or have any ideas as to the problem?
Best,
Chris
Hi Vadim,
It's not specific to the transformation file as I have tried with others for the same BW cube and get the same result.
We get a warning when we try to validate the transformation file:
"Error occurs when loading transaction data from other model".
This only appears in the validation pop-up and doesn't raise any warnings about the transformation file itself. The validation log says:
Validate and Process Transformation File Log
Log creation time: 3/7/2014 16:09
The result of validation of the conversion file: SUCCESS
The result of validation of the conversion file with the data file: FAIL
Validation Result
Validation Option: ValidateRecords = NO
Message: Error occurs when loading transaction data from other model
Reject List:
I can't find any errors anywhere else.
Best,
Chris -
UJD_Test_Package - Load Transaction data fails
Hi All,
We have a working scenario for automating master data loads with UJD_TEST_PACKAGE, but the same is not working for Load Transaction Data from InfoProvider.
The variant below for master data works fine, but not the one for transaction data. However, when we run this package manually, it ends successfully.
When triggered from UJD_TEST_PACKAGE it shows a positive accept count, but nothing is written to BPC and the data doesn't change. The package status shows Failed.
Thanks,
Krishna
Hi All,
We found the issue: extra characters in the Answer Prompt were causing it.
Not sure if something changed the prompt after we saved it, but yes, this issue is resolved.
Thanks,
Krishna -
Load transaction data from ECC to BPC 10.1 using US GAAP starter kit SP3
I need to understand a bit more about the process to load transactional data into BPC 10.1. We have US GAAP Starter Kit SP3. Below is a screenshot from the config guide:
It explains how transactional data can be extracted from the ECC system to the SAP NetWeaver BW DataSource 0FI_GL_10 and then transformed to get it loaded into BPC.
The objects /PKG/FC_C01 and /PKG/FC_DS01 are mentioned here only as a reference because they come with the RDS for Financial Close and Disclosure Management, which not all companies have.
I believe the upward data flow should be from DataSource 0FI_GL_10 to DataStore Object 0FIGL_O10 and then to InfoCube 0FIGL_R10. There is also a data flow that goes from 0FI_GL_10 directly to InfoCube 0FIGL_R10. Can anyone with experience with the US GAAP starter kit answer this?
Thank you.
Hello, we were able to load actuals in our environment with the US GAAP Starter Kit SP03. I followed the Operating Guide document up to section 5.2 and ran the Consolidation Data Manager Package with no issue. We are using the A20 Input and A20 Consolidation process flows based on flows F00, F10, F20, F30, F99, etc. According to the documentation, the Statement of Cash Flow and Changes in Equity should be automatically calculated and available from the changes in Balance Sheet accounts once Consolidation is completed. However, when I ran the corresponding reports, they brought back no data.
We loaded actual data for the whole of 2013 and January 2014. Our intention is to run the first consolidation for January 2014. The closing balance for period 12.2013 in flow F99 was copied to flow F00 of 2014 (opening balance). Flows F20 and F30 were updated with the corresponding balance sheet movements (increase/decrease), according to the delivered controls in the starter kit. However, the cash flow still shows no results.
I found the following text in the operating guide, but I am not clear whether I am missing a step. Can you please clarify? This tells me that I need to copy my 01.2014 opening balance (F00) to the 12.2013 closing balance (F99, which is done by the Copy Opening data manager package, but in the opposite direction, from Y-1 F00 to Y F99) and, in addition, to also copy that balance to 12.2013 F00 (previous year opening balance)?
"5.2.2 First Consolidation
When operating the consolidation for a given Scope for the first time in the application, it is necessary to populate and process the prior year-end time period for this Scope, in addition to the first required consolidation period. This is because the flows dedicated to the scope change analysis are properly populated by the consolidation engine provided that at least one automatic journal entry has been detected in the consolidation used as opening consolidation, which by default is the prior year end (December, Y-1). To accomplish this, it is recommended that you copy all input-level opening data of the first required consolidation (flow “F00”) into the closing data (“F99”) and opening data (“F00”) of the prior year-end time period, including intercompany breakdowns.
This breakdown by intercompany in the balance sheet triggers automatic eliminations, which are carried over to the closing position. After the consolidation process, the closing data of the prior year
end time period will contain the appropriate source data for the opening data of the following consolidation, notably the opening automatic journal entries. -
BPC 7.5 NW: Loading transaction data from infocube
Hello, I am trying to load transaction data from an infocube into a BPC application using a package based on /CPMB/LOAD_INFOPROVIDER.
The master data (cost center and cost element) is already loaded. As these objects are compounded, we have added the CO area to the key (e.g. ID=0CO_AREA+ID).
I am facing an issue where the system adds leading zeros in front of the master data values coming from the cube.
So the data looks like CO_AREA00000000000cost_center.
As the data was loaded correctly into the dimensions (CO_AREAcost_center), the loading fails.
How can we remove those leading zeros?
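One common fix for the question above (a sketch under assumptions, not the only option) is to strip the internal ALPHA-style leading zeros from the cost-center part before building the compounded ID, for example in a conversion rule or a small preprocessing step:

```python
# Hypothetical sketch: BW's internal (ALPHA-converted) format pads numeric
# cost centers with leading zeros; drop them before concatenating the
# compounded BPC member ID <CO_AREA><COST_CENTER>.

def bpc_id(co_area, cost_center):
    """Build the compounded BPC ID without the internal leading zeros."""
    return co_area + cost_center.lstrip("0")
```

Non-numeric cost centers are unaffected, since lstrip only removes zeros at the very front of the string.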
Thanks!
Hi Anton,
We use this in most of our projects; since auditing is one of the important requirements.
You should enable auditing for selected categories only. Each transaction data entry keeps a log, which can increase the database size a lot and hence impact performance.
If you enable the log, you should also purge the audit data frequently; however, take a backup of it before purging.
Hope this helps. -
Hi,
Can anyone share how to load transaction data from a BI cube to a BPC cube? How do I map key figures in BI to BPC in the NW version? Please share any document that explains this.
Thanks
Veeru
For example, I have a BI Actuals cube with two key figures, quantity and amount. I would like to copy this data to a BPC cube. Since a BPC cube has only SIGNEDDATA (one key figure), how do we transform the data from the BI cube to BPC? Can we achieve this using transformation and conversion files in NW? Does SAP provide standard packages that convert key figures in a BI cube to BPC dimension members?
What approach do you take in scenarios where you convert multiple key figures into BPC dimensions? I did not find a good document that explains this process.
Thanks for sharing any information.
Regards
Veeru -
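One common approach to Veeru's question above (an assumption here, not an SAP-delivered package) is to unpivot the BI key figures into separate SIGNEDDATA rows, with an extra BPC dimension member identifying each key figure. All names below are illustrative:

```python
# Hypothetical sketch: turn one BI row carrying two key figures (AMOUNT,
# QUANTITY) into two BPC rows with a single SIGNEDDATA key figure and a
# "measure" dimension (KFDIM) telling the two apart.

def to_bpc_rows(bi_row):
    base = {k: v for k, v in bi_row.items() if k not in ("AMOUNT", "QUANTITY")}
    return [
        {**base, "KFDIM": "AMT", "SIGNEDDATA": bi_row["AMOUNT"]},
        {**base, "KFDIM": "QTY", "SIGNEDDATA": bi_row["QUANTITY"]},
    ]

rows = to_bpc_rows({"ACCOUNT": "400100", "AMOUNT": 120.0, "QUANTITY": 3.0})
```

One way to get the same effect in a real load is to use two transformation files, each mapping SIGNEDDATA to a different BW key figure and the measure dimension to a *NEWCOL constant, run as two package executions.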
Deleting master data after loading transactional data using flat file
Dear All,
I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transactional data was loaded even though no master data exists in BW.
While loading the flat file I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube and deleted the request; then I reloaded the data with the value '04000'. Up to this point everything is fine.
But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from 'maintain master data', but it does not allow me to do so.
I have also checked whether any transactional data exists for the value '4000'; as I said, I have deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
Please suggest me on this.
Regards,
Veera
Hi,
Go to RSA1, right-click on the InfoObject and select 'Delete Master Data'. This deletes the unused master data in the table.
If this master data is not used anywhere else, just delete it completely with the SID option.
If even this doesn't work, you can delete the entire table in SE14. But this will wipe out the whole table, so be sure you want to do this.
Hope this helps
Akhan. -
Upload transaction data for transactions like KSU2/KSU5,KB31N
We are trying to upload transaction data for transactions like KSU2/KSU5 and KB31N using batch data processing within LSMW. We are facing a couple of problems:
We have to upload data that has a single header record with 200-300 line-item records. How do we define target structures to represent the header and line items? Currently we can upload a header with 1-2 line items, depending on the recording; the present recording only lets us load one line item per document.
Can Excel be used instead of a text file for uploading data in LSMW?
During processing of the transaction, the user clicks a button and checks a value; based on this value the further course of action is determined. How can we do the BDC recording, or customize this (write custom code)?
If errors occur during the upload, will processing stop at that line item, or will the error record be ignored and processing continue with the remaining data?
How can we write custom code to modify the data during the BDC session, and not before the session?
Immediate pointers on this would be very useful. Let me know if more information is required.
Thanks,
Mehfuz
Hi Ali,
as for your question about Excel upload via LSMW, you may find the links below useful:
[Upload of excel file in LSMW]
[Upload excel by LSMW for FV60 ?]
[Upload Excel file through LSMW]
[How upload an excel file using an LSMW.]
regards,
koolspy. -
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As part of the 11.5.10.2 to R12.1.1 upgrade, while applying the merged 12.1.1 upgrade driver (u6678700.drv), we got the below error:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is info about file versions and INVALID packages related to GL.
PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with error component 'GL_DEFAS_DBNAME_S' must be declared.
I can see GL_DEFAS_DBNAME_S is a VALID sequence accessible by apps user with or without specifying GL as owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG');
OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE EDITION_NAME
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1118545 PACKAGE 05-AUG-13 05-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1118548 PACKAGE 05-AUG-13 06-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1128507 PACKAGE BODY 05-AUG-13 06-AUG-13 2013-08-06:12:56:50 INVALID N N N 2
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1128508 PACKAGE BODY 05-AUG-13 05-AUG-13 2013-08-05:19:43:51 VALID N N N 2
SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S';
OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE EDITION_NAME
GL GL_DEFAS_DBNAME_S 1087285 SEQUENCE 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
APPS GL_DEFAS_DBNAME_S 1087299 SYNONYM 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
FROM DBA_OBJECTS
WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG';
OWNER OBJECT_NAME OBJECT_TYPE STATUS
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE VALID
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE BODY INVALID
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE;
Warning: Package altered with compilation errors.
SQL> show error
No errors.
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY;
Warning: Package Body altered with compilation errors.
SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG:
LINE/COL ERROR
39/17 PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL FROM dual;  -- with GL.
NEXTVAL
1002
SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL from dual; --without GL. or using synonym.
NEXTVAL
1003
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */