DTW Warehouse load
In 2007A USA localization, we are trying to load warehouses using the oWarehouse object and get the message "One of the inventory accounts is missing 'Allocation Account'". However neither the template nor the Maintain Interface contains a column for 'Allocation Account'. We have tried "PurchaseAccount", "PurchaseOffsetAccount" and "PurchaseVarianceAccount" with no luck.
Does anybody have a template that works, or can you provide a mapping between the Warehouse Accounting tab and the oWarehouse columns?
Thanks, Jeff
It can be a little maddening. The nomenclature in DTW error messages is inconsistent. One thing that helps is to cross-reference the field name in the error message with the Description column for the table in the SDK.
In the SDK, look up the warehouse table: it is OWHS. Search the Description column for 'Allocation Account'; it is the description for the field 'TransferAc'. The closest match in the template is 'TransfersAcc'.
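If you want to verify the mapping against your own company database, a quick query of OWHS shows the column in question (column names taken from the SDK description; adjust for your version):

```sql
-- 'Allocation Account' in the error message corresponds to OWHS.TransferAc
-- (DTW template column 'TransfersAcc'). Listing it per warehouse:
SELECT WhsCode, WhsName, TransferAc
FROM OWHS
ORDER BY WhsCode;
```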
Fun, huh?
Similar Messages
-
Performance issues with data warehouse loads
We have performance issues with our data warehouse load ETL process. I have run
analyze and dbms_stats and checked the database environment. What else can I do to improve performance? I cannot use Statspack since we are running Oracle 8i. Thanks,
Scott

Hi,
you should analyze the db after you have loaded the tables.
Do you use sequences to generate PKs? Do you have a lot of indexes and/or triggers on the tables?
If yes:
make sure your sequence caches values (alter sequence s cache 10000)
Drop all unneeded indexes while loading and disable triggers if possible.
How big is your Redo Log Buffer? When loading a large amount of data it may be an option to enlarge this buffer.
Do you have more than one DBWR process? Writing in parallel can speed things up when a checkpoint is needed.
Is it possible to use a direct-path load, or are you already doing so?
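Taken together, the suggestions above might look roughly like this on the Oracle side; all object names are illustrative, and the syntax should be double-checked against the 8i documentation:

```sql
-- Cache sequence values so PK generation is not a bottleneck.
ALTER SEQUENCE fact_sale_seq CACHE 10000;

-- Drop unneeded indexes and disable triggers before the load.
DROP INDEX fact_sale_ix1;
ALTER TABLE fact_sale DISABLE ALL TRIGGERS;

-- Direct-path insert avoids the buffer cache and reduces redo/undo overhead.
INSERT /*+ APPEND */ INTO fact_sale
SELECT * FROM stg_sale;
COMMIT;

-- Recreate indexes, re-enable triggers, then analyze AFTER the load.
CREATE INDEX fact_sale_ix1 ON fact_sale (sale_date);
ALTER TABLE fact_sale ENABLE ALL TRIGGERS;
ANALYZE TABLE fact_sale COMPUTE STATISTICS;
```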
Dim -
Data warehouse Loader did not write the data
Hi,
I need to know which products are the most searched. I know the tables responsible for storing this information: ARF_QUERY and ARF_QUESTION. I already have the Data Warehouse Loader module running; if anyone knows why the data warehouse loader did not write the data to the database, I would be grateful.
Thanks.

I have configured the Data Warehouse Loader and its components, and I have even enabled the logging mechanism.
I can manually pass the log files into the queue and then populate the data into the Data Warehouse database through scheduling.
The log file data is populated into this queue through JMS message processing, and this should be automated. I am unable to configure this.
Which method is responsible for adding the log file data to the loader queue, and how do I automate this? -
Problem during Data Warehouse Loading (from staging table to Cube)
Hi All,
I have created a staging module in OWB to load my flat files into my staging tables. I have created a warehouse module to load my staging tables into the dimension and cube that I have created.
My scenario:
I have a temp_table_transaction which has been loaded from my flat files. The table holds 168,271,269 records from the flat file.
I have created a mapping in OWB which takes temp_table_transaction, joins it with other tables, and applies some expressions and conversion functions to fill a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this mapping configuration:
Default operating mode in the mapping's runtime parameters = Set based
My dimension filled correctly, but I have two problems when transferring my staging table to my cube:
#1 Problem:
I have created a cube called transaction_cube with OWB, and it generated and deployed correctly.
I have created a map to fill my cube from the 168,271,268 records in the staging table stg_tbl_transaction and deployed it to the server (my cube map's operating mode is set based).
But this map did not complete after 9 hours, and I was forced to cancel it by killing its sessions. I want to know whether this load time is acceptable for this volume of data, or whether we should expect to spend even more time. Please let me know if anybody has seen this issue.
#2 Problem
To test, I created a map configured as set based, selected stg_tbl_transaction (168,271,268 records) as the source, and created another table to load the data into. I wanted to measure the time this simple map takes, but after 5 hours my data had still not loaded into the new table. I want to know where my problem is. Should I have set something in the map configuration, or something else? Please guide me on these problems.
CONFIGURATION OF MY SERVER:
I run OWB on a two-socket Xeon 5500-series server with 192 GB RAM and disks in a RAID 10 array.
Regards,
Sahar

For all of you:
It is possible to load from an InfoSet to a cube; we did it, and it was OK.
Data really are loaded from the InfoSet (cube + master data) into the cube.
When you create a transformation under a cube, the InfoSet is proposed, and it works fine.
Now the process is no longer operational and I don't understand why.
Loading from an InfoSet to a cube is possible; I can send you screenshots if you want.
Christophe -
Hi Guys,
Fairly new to this streams thing so hoping for some sound advice from any gurus out there.
We're looking to load two schemas from a production DB (9.2.0.8) into a data warehouse on 10.2.0.1 using streams, but have a few questions that may save me plenty of heartache.
Is it really possible to use downstream capture between 9.2.0.8 and 10.2.0.1? Any particular issues we'd be creating for ourselves?
Is it advisable to run the destination in archivelog mode, bearing in mind the next question....
In the event of a catastrophic failure at the warehouse end, how would I go about recovering and 'catching up' with transactions in the production DB?
Any advice regarding implementing the above planned setup would be gratefully received and appreciated. Thanks.

Hi,
See the answers below:
--Is it really possible to use downstream capture between 9.2.0.8 and 10.2.0.1? Any
--particular issues we'd be creating for ourselves?
No, it is not supported. From documentation:
"Operational Requirements for Downstream Capture
The following are operational requirements for using downstream capture:
The source database must be running at least Oracle Database 10g and the downstream capture database must be running the same release of Oracle as the source database or later."
Ref:Oracle® Streams Concepts and Administration 10g Release 2 (10.2) Chapter 2
--Is it advisable to run the destination in archivelog mode, bearing in mind the next
--question....
--In the event of a catastrophic failure at the warehouse end, how would I go about
--recovering and 'catching up' with transactions in the production DB?
In downstream capture, you can always 'replay' the capture as long as you periodically run DBMS_CAPTURE_ADM.BUILD procedure to extract the data dictionary to the redo log on the source database.
I run this build twice a week, so in the event of losing the database, I can re-create the capture and extract the data I lost. Of course, you need to have a well documented and proven procedure to do this. Also, you need to keep the archived logs needed to do this.
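As a sketch, that periodic dictionary build looks like the following; it must run on the source database, and the returned SCN (plus the archived logs from that SCN onward) is what lets you re-create the capture later:

```sql
SET SERVEROUTPUT ON
DECLARE
  dict_scn NUMBER;
BEGIN
  -- Writes the data dictionary to the redo log and returns the build SCN.
  DBMS_CAPTURE_ADM.BUILD(first_scn => dict_scn);
  DBMS_OUTPUT.PUT_LINE('Dictionary build SCN: ' || dict_scn);
END;
/
```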
--Any advice regarding implementing the above planned setup would be gratefully
--received and appreciated.
Since you cannot run downstream capture from 9i to 10g, you would have to implement a local capture at the source database(s) and then propagate the changes to the 10.2.0.1 database.
Aldo -
Import Itens - DTW - Warehouse Remove
How can I remove a warehouse that doesn't belong to an item during a DTW import using the oItems template? Any ideas, or do I have to remove it manually?
P.S. I can't use the <Locked> field.

Marcio,
I don't think that's possible through DTW. You could delete a warehouse by going to Item Master Data > Inventory tab, highlighting the warehouse row, then right-clicking and deleting the row (Ctrl+K).
In Administration > System Initialization > General Settings if you have checked
<b>Auto Add All Warehouses to New Items</b>, then each time you create a new item SAP will automatically add it to all warehouses, and when you add a new warehouse SAP will add it to existing items.
If you uncheck this, SAP does not add items to any warehouse, and likewise when you add a warehouse.
Suda -
Hi All,
I created warehouses after completing all the setup in the database (assume 10 warehouses).
Now I need to update the warehouse for all items.
Will DTW work in this scenario?
If it works, kindly help me.
thanks
Anish A

Yes, it will work.
Update all items using the Items template with DTW, setting the DefaultWarehouse column to the desired default warehouse for each item.
Run the DTW wizard.
Thanks
Manvendra Singh Niranjan -
DTW Batch Load (B1 8.8)
Hi All,
I have gone thru most of the threads related to the issue of uploading Batches.
Header
DocNum   DocType           Handwritten   Printed   DocDate    CardCode   Ref1       Ref2
500      dDocument_Items   tNO           psYes     20110303   BRI012     Old Data   ukn01

Lines
ParentKey   LineNum    ItemCode     Quantity   Price
(DocNum)    (LineNum)  (Item No.)   (Quantity) (Price)
500         0          BCRC-306     2.6        1.79
500         1          BF15500      40.5       2

Batches
ParentKey   BaselineNumber   LineNum    BatchNumber     Notes   Quantity
(DocNum)    (DocLineNum)     (LineNum)  (DistNumber)    (Notes) (Quantity)
500         0                0          P02506-84343    Test    2.6
500         1                1          P02879-95878    Test    2
500         1                0          P03437-108960   Test    3.5
500         1                1          P03437-108962   Test    5.5
500         1                2          P03437-109153   Test    12.5
500         1                3          P03437-109154   Test    4.5
500         1                4          P03437-109155   Test    12.5
When the load process is simulated, the target data shows that only P02506-84343 and P03437-109155 will be loaded.
Can someone explain what I am doing incorrect?
Regards
Earl

Hi All,
Problem solved.
Regards
Earl -
Error loading Valid Values for UDF via DTW
Hi,
SB1 8.81 PL09
I´m using DTW to load Valid Values in UDF.
Template CUFD - UserFieldsMD
TableName   FieldID   Name       Description   Type   Size   SubType   EditSize   DefaultValue
(TableID)   (FieldID) (AliasID)  (Descr)       (TypeID) (SizeID) (EditType) (EditSize) (Dflt)
ACRD        21        Name       Description   A      25               25         Default Value

Template UFD1 - ValidValuesMD
ParentKey     ParentKey2   LineNum    Value           Description
(TableName)   (FieldID)    (LineNum)  (FldValue)      (Descr)
ACRD          21           0          Default Value   Default Value
ACRD          21           1          Value 01        Value 01
ACRD          21           2          Value 02        Value 02
ACRD          21           3          Value 03        Value 03
ACRD          21           4          Value 04        Value 04
ACRD          21           5          Value 05        Value 05
ACRD          21           6          Value 06        Value 06
I get the following error message:
Key - Reason
ACRD,21 - CServiceData::VerifyPropertyWrite failed; Property 'FielID' of 'UserFieldMD' is read only 65171
Is any solution to work around this error?
Thanks

Thank you Eddy, but it seems it does not work.
The error message received from DTW is:
<BOM>
  <BOM>
    <BO>
      <AdmInfo>
        <Object>66</Object>
        <Version>2</Version>
      </AdmInfo>
      <ProductTrees>
        <row>
          <TreeCode>A00001</TreeCode>
          <Quantity>1</Quantity>
          <TreeType>iProductionTree</TreeType>
        </row>
      </ProductTrees>
      <ProductTrees_Lines>
        <row>
          <Currency>USD</Currency>
          <IssueMethod>im_Backflush</IssueMethod>
          <ItemCode>A00006</ItemCode>
          <Price>160</Price>
          <PriceList>1</PriceList>
          <Quantity>3</Quantity>
          <Warehouse>01</Warehouse>
        </row>
        <row>
          <Currency>USD</Currency>
          <IssueMethod>im_Backflush</IssueMethod>
          <ItemCode>S10002</ItemCode>
          <Price>160</Price>
          <PriceList>1</PriceList>
          <Quantity>3</Quantity>
          <Warehouse>01</Warehouse>
        </row>
      </ProductTrees_Lines>
    </BO>
  </BOM>
  <ErrorMessage>A00001 - Application-defined or object-defined error</ErrorMessage>
</BOM> -
Sharepoint 2013 and SSRS how to send reports on date schedule after dw load completes
Certainly with subscriptions we can generate an SSRS report on a schedule, say every Monday morning at 5 AM PT. My problem is that I want to run those reports only after the data warehouse has completed its load. For example, if the DW breaks at 4 AM and does not finish the load, the reports should not run; once the DW is finished, the reports should run. The 5 AM time is really a placeholder for the first attempt to send the reports. It should keep trying until it can send them or Tuesday comes around.
The only approach I can think of is via the DW, with a job and stored procedure you could have exec anything you want. Is it possible to exec a Reporting Services report from SQL? Is there a way from within SharePoint?
Ken Craig

Hi Ken,
According to your description, you want to fire the SQL Server Reporting Services subscription after the data warehouse load completes, right?
By default, when a subscription is created, a corresponding SQL Server Agent job is created at the same time. The SQL Server Agent job has the same schedule as the shared or report-specific schedule used by the subscription.
The corresponding SQL Server Agent job calls the stored procedure AddEvent to add an event in SSRS. The SSRS notification service then fetches the event from the SSRS database to deliver the subscription.
Given that, instead of relying on a regular shared or report-specific schedule, you can drive the subscription on an irregular schedule. So in your scenario, you can configure the job steps to fire the subscription after the data warehouse load completes. For the details, please refer to the link below.
http://social.msdn.microsoft.com/Forums/en-US/32bc6d2d-5baa-4e27-9267-96a4bb90d5ec/forum-faq-how-to-configure-an-irregular-schedule-for-report-subscription?forum=sqlreportingservices
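A minimal sketch of that approach, triggering the subscription from the final step of the DW load job; the GUID and the success check are placeholders, and the real SubscriptionID can be looked up in ReportServer.dbo.Subscriptions:

```sql
-- Fire the SSRS subscription only after the DW load reports success.
DECLARE @load_succeeded BIT;
SET @load_succeeded = 1;  -- replace with your ETL completion check

IF @load_succeeded = 1
    EXEC ReportServer.dbo.AddEvent
         @EventType = N'TimedSubscription',
         @EventData = N'00000000-0000-0000-0000-000000000000';  -- SubscriptionID
```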
Regards,
Charlie Liao
TechNet Community Support -
DTW serial numbers with stock take count
Has anyone worked out a way to use DTW to load serial numbers into the stock count sheets along with the item counts?
Thanks Joseph, but I am trying to do a regular monthly stock take, not a stock 'document' like a receipt or transfer.
The DTW template oStockTakings is what I have always used; this loads the stock counts into the "Initial Quantities, Stock Tracking and Stock Posting" screen. But I haven't tried to stock count serialised items before and want to know if there is some way to also load up the counted serial numbers, rather than having to intervene manually at the stock posting tab of the screen mentioned above.
kind regards,
Cathy -
Financial Analytics Implementation
Hi guys
I am looking for generic implementation plans for Financial Analytics to validate the assumption the product can be installed in around 3 months.
Any assistance would be greatly appreciated.

This is one plan (partial), but you may have to change it to suit your needs.
Plan and Manage
- Establish project objectives and scope
- Prepare for the kickoff design session
Gather Business Requirements
Install and Configure Software
Install and Configure OBIEE Software
Test the environment
Install client software on developer laptops
Implement and configure Prebuilt BI Applications
-Configure common components of enterprise warehouse
-Setup Security Model and Access Control
-Create metadata schemas for DAC and Informatica
-Import metadata into DAC and Informatica schemas
Configure DAC Server
- Verify all job dependencies are loaded
- Disable jobs not required
- Unit test DAC
Configure Informatica Server
- Verify all required mappings are loaded
- Verify all connections are established
- Unit test mappings and make necessary adjustments
- Create BAW Enterprise warehouse
- Load and Verify metadata
Configure Out-Of-Box (OOB) AR, AP, and GL Reports
-Load/Test subset of data into enterprise warehouse from EBS
-Assess effort to modify GAP
-Design Modifications
-Test the new ETL processes for accuracy and efficiency
-Load and test
Performance tune the Load and Reports
Reports
-Select/Modify Desired AR and AP Dashboard reports
-Validate that AR and AP data is correct in the OOB reports
-Review with client team and select reports useful
-Remove unwanted reports and make modifications per client
-Test the dashboard reports
-Obtain client signoff on the AR and AP reports and dashboards
Deploy Data Warehouse Solution
-Migrate solution to Production and Validate the process
-Set up OBIEE Dashboard security
-Schedule data warehouse refresh processes
-Hold a pre-deployment demonstration for all beta users
-Train Beta users on OLAP/report usage
-Train I.T. personnel on deployment process
-Contingency time for unexpected tasks
-Go live with initial group of users
-Post Implementation Support -
Allow for timezone shifting for Global Reporting.
Hi Everyone,
I would like to enable end users to be able to dynamically shift the timezone that data is being reported on. Before I go into the technical details, let me outline the business case.
Our Setup:
-- We are a global company and we have OBIEE users in many different timezones around the world.
-- But our Order Management system (Oracle EBS) processes every order in EST.
-- We pull the data into a warehouse and store the dates and times in EST.
The Issue:
-- Because of the timezone differences, it is difficult to understand Sales per day and especially per hour.
-- For example, Japan users are 14 hours ahead of us on EST right now, so their Saturday noon sales would show up as 10PM EST on Friday which is an odd hour to see a spike in a sales report.
I want to have a way to shift the timestamp, and date if necessary, dynamically based on a user selected prompt. This way the sales make sense for the local reporting team.
One of the ideas I came up with was having the warehouse loaded with multiple timestamp and date columns, but that seems like a brute force method because we'd need to add and maintain a bunch of columns for each date we want to do this to.
I'd prefer something more elegant than that method.
Does OBIEE 10g or 11g have features that will shift times and dates like this? Does anyone have any tricks for this?
Thanks!
-=Joe
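One lighter-weight alternative to maintaining extra columns, assuming the warehouse is on Oracle, is to convert at query time with AT TIME ZONE, driven by a prompt value; the table, column, and bind names here are illustrative:

```sql
-- Shift the stored EST timestamp into the user's chosen zone at query time.
-- :user_tz would come from a dashboard prompt, e.g. 'Asia/Tokyo'.
SELECT order_id,
       FROM_TZ(CAST(order_ts AS TIMESTAMP), 'America/New_York')
         AT TIME ZONE :user_tz AS local_order_ts
FROM   sales_fact;
```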
Edited by: user9208525 on Oct 20, 2011 1:06 PM

Hi,
The Personnel Area and Personnel Subarea are defined per country. If the same Personnel Subarea name exists for different countries, from my point of view you have to present the Personnel Area for the corresponding Personnel Subarea you need to address.
Cheers
Jun -
[Special Prices for Business Partners] and [Hierarchies and Expansions]
Dear Sirs,
I am running into three problems.
1.) I loaded 313 items' Special Prices by DTW; however, I cannot locate where they went.
I searched via the path [Inventory > Price Lists > Special Prices > Special Prices for Business Partners] but found nothing there.
2.) I cannot find DTW templates for the Hierarchies and Expansions data load.
I also searched via [Inventory > Price Lists > Special Prices > Hierarchies and Expansions] in case the Special Prices I loaded ended up there, but found nothing.
3.) I cannot find any information related to Hierarchies and Expansions in the DI API help file.
Could it be that these two functionalities require manual input?
If so, how on earth did the 313 Special Prices DTW data load go through "successfully" leaving no trace?
I am pulling my hair out looking for advice.
Would appreciate it greatly if someone could enlighten me on these two functionalities.
Kuni - Tokyo
Edited by: furuya kunitomo on May 15, 2009 2:28 PM

Hi Kuni,
To quickly answer your questions:
1.) I loaded 313 items Special Prices by DTW however I can not locate where it went?
The path is correct. If you can't find anything, probably something went wrong in the DTW import.
Check that the "Test Run" checkbox was not checked and that all required keys were entered in the templates.
2.) I can not find DTW templates for Hierarchies and Expansions data load.
The relevant templates are:
SpecialPrices.xlt
SpecialPricesDataAreas.xlt
SpecialPricesQuantityAreas.xlt
the default location when installing DTW is:
C:\Program Files\SAP\Data Transfer Workbench\Templates\Templates\oSpecialPrices
When entering Hierarchies and Expansions (renamed to Period and Volume Discount in version 2007), you must enter *x for CardCode, where x is the price list number.
3.) I can not find any information related to Hierarchies and Expansions in the diapi help file.
The information in the DI API file is under the SpecialPrices Object.
See below some general information regarding special prices:
SpecialPrices is a business object part of the Inventory and Production module under
Select Inventory > Price Lists > Special Prices > Special Prices for Business Partners.
OR
Select Inventory > Price Lists > Period and Volume Discount (Hierarchies and Expansions in previous versions)
Interesting points:
Source table: OSPP
DTW Template: SpecialPrices.csv
Mandatory fields in SAP Business One: CardCode and ItemCode.
PriceListNum is a foreign key to the PriceLists object - source table OPLN, field name ListNum.
CardCode is the Business Partner CardCode to enter Special Prices for Business Partners.
Child object of the SpecialPrices Object: SpecialPricesDataAreas (source table: SPP1)
DTW Template: SpecialPricesDataAreas.csv
LineNum (Field name LINENUM) - Always enter the appropriate LineNumber you want to update (starts from 0).
Child object of the SpecialPricesDataAreas Object: SpecialPricesQuantityArea (source table: SPP2)
DTW Template: SpecialPricesQuantityAreas.csv
LineNum (Field Name SPP2LNum). Always enter the appropriate LineNumber you want to update (starts from 0).
Hope that information helps a little. If you have any further questions, please include the following information:
1. SAP Business One Version including Patch level
2. Do you get any error message after the import? What is the message returned?
Kind Regards,
Friederike Mundt
SAP Business One Forums Team -
Search in txt file using PL/SQL
Is there any way to search for a string in a text file using a PL/SQL block??
Thanks in advance.
Ashish

Richard:
It would depend on the nature of the text file, but it can be done for most plain text files. If there is some consistent delimiter, that is, if the text file is fielded in some way, you could define an external table with the appropriate structure and then use normal SQL queries against the particular fields from the file.
If the file is just plain text with no fields, you could create an external table something like:
CREATE TABLE myexternal (
  line VARCHAR2(4000)
)
ORGANIZATION EXTERNAL
  (TYPE oracle_loader
   DEFAULT DIRECTORY mydirectory
   ACCESS PARAMETERS
     (RECORDS DELIMITED BY newline
      BADFILE 'textfile.bad'
      DISCARDFILE 'textfile.dis'
      LOGFILE 'textfile.log'
      FIELDS (line CHAR(4000) NULLIF line = BLANKS))
   LOCATION ('textfile.txt'))
REJECT LIMIT UNLIMITED;

Then use normal SQL to search the contents of that file.
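For example, a grep-style search over the file then becomes an ordinary query (the ORA- pattern is just an illustration):

```sql
-- Find all lines in textfile.txt containing an Oracle error code.
SELECT line
FROM   myexternal
WHERE  INSTR(UPPER(line), 'ORA-') > 0;
```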
I have not done any benchmarks on this, but my gut feel is that it would be significantly faster than using utl_file to loop over the lines looking for specific content.
In one of our warehouse load programs, we process a text file that is typically about 7 MB in under 30 seconds.
John