Partition by Data Packet
I had a problem during table conversion in transaction SE14.
I corrected the error by changing the storage parameters to NO partitioning.
Does the cube get partitioned by data packet by default?
Regards,
Anita
The F fact table is partitioned on the request, which is related to the data packets. The E fact table may be partitioned on 0FISCPER or 0CALMONTH, which may or may not also be an issue here if the E fact table had been partitioned.
The partitioning of the F fact table is designed to support loading: each request always loads to a new partition. This partition is then dropped when the request is compressed.
You may very well have failures when you try to compress or back out requests. I think you will have performance problems any time a request needs to be deleted, either because you want to back it out or because it has been compressed. Your compress time will probably also be slower if you only compress selected requests, e.g. requests more than 10 days old. I'm not intimately familiar with the actual SQL and DB options used when loading a request, but it is reasonable to assume that BW assumes it is inserting into an empty partition and optimizes accordingly.
I don't know if BW will be smart enough to figure out that the F fact table is not partitioned and adjust accordingly, but you will certainly suffer on the performance side.
Similar Messages
-
Hi
I have two questions:
1) Where can I set the data packet size in BW and R/3, and also in BI: how and where can we set it?
2) By what method/logic do we select the key fields in a DSO?
For example: I have 5 tables in the source and each table has primary keys; how do we know which primary keys should be kept in the KEY FIELDS folder in the DSO?
Full points will be assigned.
Hi,
Data packet settings for the data to be extracted from R/3 to BI can be made through:
1) SBIW > General Settings > Maintain Control Parameters for Data Transfer.
These settings are common for all the InfoPackages which extract data from R/3.
2) If you want to make settings relevant to a specific InfoPackage, then:
RSA1 > click on the specific InfoPackage > Scheduler (in the menu bar) > DataS. Default Data Transfer.
3) And if you want to make DSO package settings, go to transaction RSODSO_SETTINGS.
Here you can make package settings for DSO activation, parameters for SID generation, etc.
The selection of key fields depends upon the requirement:
based on the key fields, the corresponding data fields are either overwritten or added.
Regards,
Chaitanya. -
Is there a way of partitioning the data in the cubes
Hello BPC Experts,
we are currently running an AppSet with 4 applications. However, two of these are getting really big.
In BPC for MS there is a way to partition the data, as I saw in the how-to guides.
In the NW version, BPC queries the MultiProvider. Is there a way to split the underlying basis cube into several cubes (split by time or legal entity)?
I think this would help to increase the speed a lot as data could be read in parallel.
Help is very much appreciated.
Daniel
Edited by: Daniel Schäfer on Feb 12, 2010 2:16 PM
Hi Daniel,
The short answer to your question is that, no, there is not a way to manually partition the infocubes at the BW level. The longer answer comes in several parts:
1. BW automatically partitions the underlying database tables for BPC cubes based on request ID, depending on the BW setting for the cube and the underlying database.
2. BW InfoCubes are very different from MS SQL server cubes (ROLAP approach in BW vs. MOLAP approach usually used in Analysis Services cubes). This results in BW cubes being a lot smaller, reads and writes being highly parallel, and no need for a large rollup operation if the underlying data changes. In other words, you probably wouldn't gain much from semantic partitioning of the BW cubes underlying BPC, except possibly in query performance, and only then if you have very high data volumes (>100 million records).
3. BWA is an option for very large cubes. It is expensive, but if you are talking 100s of millions of records you should probably consider it. It uses a completely different data model than ROLAP or MOLAP and it is highly partition-able, though this is transparent to the BW administrator.
4. In some circumstances it is useful to partition BW cubes. In the BW world, this is usually called "semantic partitioning". For example, you might want to partition cubes by company, time, or category. In BW this is currently supported through manually creating several basic cubes under a multiprovider. In BPC, this approach is not supported. It is highly recommended to not change the BPC-generated Infocubes or Queries in any way.
5. If you have determined that you really need to semantically partition to manage data volumes in BPC, the current best way is probably to have multiple BPC applications with identical dimensions. In other words, partition in the application layer instead of in the data layer.
Hopefully that's helpful to you.
Ethan -
Importing partitioned table data into non-partitioned table
Hi Friends,
SOURCE SERVER
OS:Linux
Database Version:10.2.0.2.0
i have exported one partition of my partitioned table like below..
expdp system/manager DIRECTORY=DIR4 DUMPFILE=mapping.dmp LOGFILE=mapping_exp.log TABLES=MAPPING.MAPPING:DATASET_NAP
TARGET SERVER
OS:Linux
Database Version:10.2.0.4.0
Now when I am importing into another server I am getting the below error:
Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 17 January, 2012 11:22:32
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "MAPPING"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "MAPPING"."SYS_IMPORT_FULL_01": MAPPING/******** DIRECTORY=DIR3 DUMPFILE=mapping.dmp LOGFILE=mapping_imp.log TABLE_EXISTS_ACTION=APPEND
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE failed to create with error:
ORA-00959: tablespace 'MAPPING_ABC' does not exist
Failing sql is:
CREATE TABLE "MAPPING"."MAPPING" ("SAP_ID" NUMBER(38,0) NOT NULL ENABLE, "TG_ID" NUMBER(38,0) NOT NULL ENABLE, "TT_ID" NUMBER(38,0) NOT NULL ENABLE, "PARENT_CT_ID" NUMBER(38,0), "MAPPINGTIME" TIMESTAMP (6) WITH TIME ZONE NOT NULL ENABLE, "CLASS" NUMBER(38,0) NOT NULL ENABLE, "TYPE" NUMBER(38,0) NOT NULL ENABLE, "ID" NUMBER(38,0) NOT NULL ENABLE, "UREID"
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_TG_ID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."PK_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_UREID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_V2" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_PARENT_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."CKC_SMAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."PK_MAPPING_ITM" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
Processing object type TABLE_EXPORT/TABLE/COMMENT
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TG" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
ORA-39112: Dependent object type INDEX:"MAPPING"."X_PART" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."X_TIME_T" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."X_DAY" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
ORA-39112: Dependent object type INDEX:"MAPPING"."X_BTMP" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2_T" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
Job "MAPPING"."SYS_IMPORT_FULL_01" completed with 52 error(s) at 11:22:39
Please help..!!
Regards
Umesh Gupta
Yes, I have tried that option as well,
but when I write one tablespace name in the REMAP_TABLESPACE clause, it gives an error for the second one, and if I include the 1st and 2nd tablespaces it gives an error for the 3rd one.
The one option I know of is to write all the tablespace names in REMAP_TABLESPACE, but that is too lengthy a process. Is there any other way possible?
Regards
Umesh
AFAIK the option you have is the one I would recommend ... though it is lengthy :-(
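If the tablespace list is long, the parameter list can be generated instead of typed by hand; as far as I know, impdp accepts REMAP_TABLESPACE repeated once per source:target pair. A rough sketch (the tablespace names below are placeholders, not the real ones from this dump):

```python
def remap_args(pairs):
    """Build one REMAP_TABLESPACE=OLD:NEW argument per (old, new) pair."""
    return [f"REMAP_TABLESPACE={old}:{new}" for old, new in pairs]

# Hypothetical source tablespaces, all remapped to USERS on the target.
pairs = [("MAPPING_ABC", "USERS"), ("MAPPING_DEF", "USERS")]
print("impdp system/manager DIRECTORY=DIR3 DUMPFILE=mapping.dmp "
      + " ".join(remap_args(pairs)))
```

You could also put the generated lines into a parameter file and pass it with PARFILE, which keeps the command line readable.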
Wait for some EXPERT and GURU's review on this issue .........
Good luck ....
--neeraj -
How to set the data packet in order??
We have 20 data packets that should be in sequential order; how do we set them in order?
Example: this is the scenario, the data packets arrive in this order: 20, 1, 10, 5, 9, 13, 4, 8, 13, 2, 15, 19, 14, like this.
I have to put them in order; how do I do it?
Hi,
I think you should just use a process chain.
Call transaction RSPC and build it in the order needed.
regards
Tom -
Data packets/size - Generic extractor
Hi all,
We built a custom function-module-based DataSource and it is extracting data to BW in one big packet of 900,000+ records, and the load is taking about 18 hours. We are trying to split the BW extraction into smaller data packets to improve performance, but are unable to do so. The following is our extraction program.
Please let me know where we are doing it wrong.
This program fetches/builds E_T_DATA. The issue is that the program does not split the data into packets as the SAP standard program does.
*" Local interface:
*" IMPORTING
*" VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
*" VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*" VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*" VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*" VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
*" VALUE(I_DATAPAKID) TYPE SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
*" TABLES
*" E_T_DATA STRUCTURE Z0333W OPTIONAL
*" I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT
*" I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS
*" EXCEPTIONS
*" NO_MORE_DATA
DATA: lr_range_name TYPE rsselect-fieldnm.
DATA: st_e_t_data TYPE z0333w.
STATICS: l_cursor TYPE cursor.
STATICS: called(1) TYPE c VALUE 'N'.
* Maximum number of lines for DB table
STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
FIELD-SYMBOLS: <l_range> TYPE ANY,
<l_t_range> TYPE STANDARD TABLE.
IF i_initflag = 'X'.
* Initialization: check input parameters,
* buffer input parameters,
* prepare data selection
* Fill parameter buffer for data extraction calls
g_s_interface-requnr = i_requnr.
g_s_interface-isource = i_dsource.
g_s_interface-maxsize = i_maxsize.
g_s_interface-initflag = i_initflag.
g_s_interface-datapakid = i_datapakid.
g_flag_interface_initialized = sbiwa_c_flag_on.
REFRESH g_t_select.
REFRESH g_t_fields.
APPEND LINES OF i_t_select TO g_t_select.
APPEND LINES OF i_t_fields TO g_t_fields.
ELSE.
* First data package of first table -> open cursor
IF g_counter_datapakid = 0.
*--Get selection ranges
LOOP AT i_t_select.
MOVE-CORRESPONDING i_t_select TO <l_range>.
APPEND <l_range> TO <l_t_range>.
ENDLOOP.
l_maxsize = g_s_interface-maxsize.
* Fetch plants for the company code
PERFORM get_plants.
* Fetch MAST data into an internal table, as we will be using MAST to validate
* whether a BOM exists or not.
SELECT * FROM mast INTO TABLE it_mast
WHERE stlan = '1' OR stlan = '6'.
SORT it_mast BY matnr werks stlan.
* Material BOM information
* First data package -> OPEN CURSOR
OPEN CURSOR WITH HOLD l_cursor FOR
SELECT mast~matnr
mast~werks
mast~stlnr
mast~stlan
mast~stlal
stko~stlty
FROM mast INNER JOIN stko
ON stko~stlnr = mast~stlnr AND
stko~stlal = mast~stlal
FOR ALL ENTRIES IN gt_werks
WHERE mast~matnr IN gr_matnr AND
mast~werks IN gr_werks AND
mast~stlan IN gr_stlan AND
mast~werks = gt_werks-werks AND
mast~stlal = '01' AND
stko~stlty = 'M' AND "Material BOM only
( mast~stlan = '1' OR mast~stlan = '6' ).
ENDIF.
* Fetch records into interface table.
* named E_T_'Name of extract structure'.
REFRESH: gt_mat_bom,gt_mat_bom1.
FETCH NEXT CURSOR l_cursor
APPENDING CORRESPONDING FIELDS
OF TABLE gt_mat_bom
PACKAGE SIZE i_maxsize.
IF sy-subrc <> 0.
CLOSE CURSOR l_cursor.
RAISE no_more_data.
ELSE.
* get BOM data and fill E_T_DATA
PERFORM get_bom_data TABLES e_t_data.
ENDIF.
* Increment Package
g_counter_datapakid = g_counter_datapakid + 1.
ENDIF.
ENDFUNCTION.
Thanks,
Anirudh.
I'm not sure, but this might help:
* Fetch records into interface table.
* named E_T_'Name of extract structure'.
DO.
REFRESH: gt_mat_bom,gt_mat_bom1.
FETCH NEXT CURSOR l_cursor
APPENDING CORRESPONDING FIELDS
OF TABLE gt_mat_bom
PACKAGE SIZE i_maxsize.
IF sy-subrc <> 0.
EXIT.
ELSE.
* get BOM data and fill E_T_DATA
PERFORM get_bom_data TABLES e_t_data.
ENDIF.
* Increment Package
g_counter_datapakid = g_counter_datapakid + 1.
ENDDO.
CLOSE CURSOR l_cursor.
RAISE no_more_data.
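The open-cursor-once / fetch-one-package-per-loop pattern above can also be sketched outside ABAP, if that helps to see the control flow. A minimal Python illustration (sqlite3, the table, and the row counts are stand-ins, not part of the original extractor):

```python
import sqlite3

# In-memory stand-in for the source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mast (matnr INTEGER)")
conn.executemany("INSERT INTO mast VALUES (?)", [(i,) for i in range(25)])

MAXSIZE = 10                                     # rows per data packet
cursor = conn.execute("SELECT matnr FROM mast")  # cursor opened only once

packets = []
while True:
    rows = cursor.fetchmany(MAXSIZE)  # one FETCH ... PACKAGE SIZE call
    if not rows:                      # like sy-subrc <> 0 -> NO_MORE_DATA
        break
    packets.append(len(rows))

print(packets)  # -> [10, 10, 5]
```

The point is the same as in Rob's correction: the cursor is opened once, each iteration pulls at most one package of rows, and the loop ends only when the fetch comes back empty.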
Rob -
How to use a 2nd partition for data storage?
The 40 GB hard disk on my Satellite L10 was delivered with two 20 GB partitions (drives C and D). I now understand (by reading this forum) that it is preferred to use one partition for data storage.
Until now there was no need to use the D drive, but now I need the extra space.
How can I use the folder structure in Windows XP Home (My Documents etc.) and put the data on the D drive?
We work on this computer with several accounts.
I tried to move one folder out of My Documents to the D drive, but then it can be reached by the other users. Should I use shortcuts pointing to the D drive? I hope there is a simple solution.
Thanks in advance.
(I tried my best to write this in English....)
Hi,
I haven't tried it before, but you can try to create your personal folder on the D partition. The notebook administrator should set the security settings, which can be found under the folder properties: go to the Security tab, check all permissions for the other users, and choose the Deny option for all of them.
I believe that in this way you can keep your data folder to yourself and be sure that other users cannot read or execute any of your private data.
Please let me know if it works properly.
Bye -
Need help in logging JTDS data packets
Hi All,
I have a web application which uses a SQL Server database.
I have to track down some problems in the database connection, and for that I need to log the jTDS data packets.
I have tried to use the class net.sourceforge.jtds.jdbc.TdsCore, but the constructor of the TdsCore class needs two parameters: one is ConnectionJDBC2 and the other is SQLDiagnostic.
I have tried a lot, but it does not allow me to import the class SQLDiagnostic.
I need help in logging jTDS data packets. If there are any other ways, or if anybody has any idea about logging jTDS data packets/SQLDiagnostic, please reply; it is urgent...!!
Thanks in advance......!!
If you want to use log4j, then in your project create a file called log4j.properties and add this:
# Set root logger level to INFO and its only appender to ConsoleOut.
log4j.rootLogger=INFO,ConsoleOut
# ConsoleOut is set to be a ConsoleAppender.
log4j.appender.ConsoleOut=org.apache.log4j.ConsoleAppender
# ConsoleOut uses PatternLayout.
log4j.appender.ConsoleOut.layout=org.apache.log4j.PatternLayout
log4j.appender.ConsoleOut.layout.ConversionPattern=%-5p: [%d] %c{1} - %m%n
log4j.logger.org.apache.jsp=DEBUG
#Addon for
com.sun.faces.level=FINE
Go to your class and add this line:
private static final Logger logger = Logger.getLogger("classname");
and then you can use
logger.info();
logger.error();
methods -
The maximum size of data packet set in configuration is 25MB.
I want to change the size of the data packet to 50 MB. I don't want to change it globally (in SPRO); I want to change it only for a specific InfoPackage.
But in the InfoPackage the system does not allow changing the packet size to more than 25 MB.
Please suggest the way.
Regards,
Dheeraj
Hi..
MAXSIZE = Maximum size of an individual data packet in KB.
The individual records are sent in packages of varying sizes during the data transfer to the Business Information Warehouse. Using these parameters you determine the maximum size of such a package, and therefore how much of the main memory may be used for the creation of the data package. SAP recommends a data package size between 10 and 50 MB.
https://www.sdn.sap.com/irj/sdn/directforumsearch?threadid=&q=cube+size&objid=c4&daterange=all&numresults=15
MAXLINES = upper limit for the number of records per data packet.
The default setting is 'Max. lines' = 100,000.
The maximum main-memory requirement per data packet is around:
memory requirement = 2 * 'Max. lines' * 1000 bytes,
i.e. 200 MB with the default setting.
3. The formula for calculating the number of records
The formula for calculating the number of records in a data packet is:
packet size = MAXSIZE * 1000 / transfer structure size (ABAP length),
but not more than MAXLINES;
e.g. if MAXLINES is less than the result of the formula, then only MAXLINES records are transferred into BW.
The size of the data packet is the lower of MAXSIZE * 1000 / transfer structure size (ABAP length) and MAXLINES.
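As a quick check of the formula, here is a small sketch (the MAXSIZE, MAXLINES, and structure-length values are examples, not this system's actual settings):

```python
def packet_records(maxsize_kb, maxlines, struct_len_bytes):
    """Records per data packet: MAXSIZE * 1000 / transfer structure
    length (bytes), capped at MAXLINES -- the lower of the two wins."""
    return min(maxsize_kb * 1000 // struct_len_bytes, maxlines)

# MAXSIZE = 20,000 KB, MAXLINES = 100,000, transfer structure = 250 bytes:
print(packet_records(20000, 100000, 250))   # MAXSIZE limit applies -> 80000
# Doubling MAXSIZE alone no longer helps once MAXLINES caps the packet:
print(packet_records(50000, 100000, 250))   # MAXLINES caps it -> 100000
```

This is why raising MAXSIZE alone sometimes has no effect: MAXLINES has to be raised as well once it becomes the binding limit.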
Need inputs how to extract data to BI in multiple Data packets from ECC
HI Experts,
I would like to know how I can make the data be fetched in multiple data packets; as of now all the data is coming into BI in a single data packet.
I want to get this data in multiple data packets.
I have checked the InfoPackage settings in BI; they are similar to any other InfoPackage which fetches in multiple data packets.
Is there any possibility to restrict the data within ECC?
Hussein,
Thank you for the helpful information.
That document gave me a lot of information about the ANSI extract for BENEFITS.
Thanks
Kumar. -
Re: Reading data from a data packet
We did not find any driver for our instrument, so we are trying to capture data using the method I explained before. Below is an example of the data packets that we are receiving in our serial-port monitoring software.
First data packet 13 88 04 4C 00 00 01 F3 00 00 7B 37 02 EE
Second data packet 13 88 04 4C 00 00 01 EE 00 00 79 D9 02 EE
Third data packet 13 88 04 4C 00 00 01 EB 00 00 78 BA 02 EE
As seen above, 13 88 (5000 = frequency in decimal) and 04 4C (represents 1100 = voltage in decimal) are constant, and the ending is represented by 02 EE. Apart from this, the rest keeps changing in every packet (for the first packet, 01 F3 = 499, assumed to be current, and 7B 37 = 31543, assumed to be power), and they all have different scalings. The 00 00 just represents the padding between them. Now we have to read the values all together, say in different boxes, and present them on the LabVIEW front panel. Here I have attached a picture of the instrument reading side by side with the serial monitoring software reading. How do I do that? I have read about something called VISA Read and Write, but they only accept strings, not bytes. Any kind of help is appreciated.
Duplicate Post
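Assuming each field is an unsigned 16-bit big-endian word (which matches the decimal values quoted above; the field names and any scaling factors are guesses), the packet can be decoded like this, shown here in Python:

```python
import struct

# One 14-byte packet from the thread, as raw bytes.
packet = bytes.fromhex("1388044C000001F300007B3702EE")

# Seven big-endian unsigned 16-bit words; the names are assumptions:
# frequency, voltage, padding, current(?), padding, power(?), end marker.
freq, volt, _pad1, curr, _pad2, power, end = struct.unpack(">7H", packet)

print(freq, volt, curr, power, hex(end))  # -> 5000 1100 499 31543 0x2ee
```

In LabVIEW the equivalent step after VISA Read is to convert the returned string to a byte array and combine each byte pair into a 16-bit word; the actual scaling for the current and power fields still has to come from the instrument's documentation.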
-
How to select more than one data packet?
Hi,
I have uploaded data using 3 different data packets. However, for each of these packets there are some errors.
Using Monitor > PSA Maintenance, I want to display the error data records for all three data packets on one screen. These erroneous records will be sent to our users for rectification. However, I can only select and display erroneous data records packet by packet, so this has to be done three times instead of once.
Can you advise on how to display all erroneous data records on one screen?
Hello Fulham FC,
How are you?
I feel it should be possible to select 2 or more requests at once. Provide:
No. of Records -> the total number of records in the packages you select,
From Record -> 2147483647.
Then try CONTINUE. In our system it's throwing a dump !!!
I believe, dude, that in PSA Maintenance there is a SELECT ALL button, and even in the data package it allows selecting two or more. There should be some way to do this.
Best Regards....
Sankar Kumar
+91 98403 47141 -
Hi,
I want to know the difference between a data package and a data packet, and where each comes up in SAP BW.
with regards
tushar
Hello,
The term data package is related to the DTP, which is used to load data from the PSA to further data targets.
Start and end routines work at the package level, so a routine runs for each package one by one. By default a package has its data sorted based on keys (the non-unique keys (characteristics) of the source or target), and by setting semantic keys you can change this order. So a package with more data will take more time to process than a package with less data.
The term data packet is related to the InfoPackage, which is used to load data from the source system to BI (PSA).
As per the SAP standard, we prefer to have 50,000 records per data packet.
For every data packet, a commit and save is done, so fewer data packets are preferable.
If you have 100,000 (1 lakh) records per data packet and there is an error in the last record, the entire packet fails.
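The commit-per-packet behaviour described above can be illustrated with a short sketch (the packet size and record count are just examples): records are split into fixed-size packets, and an error in one record fails only the packet that contains it.

```python
def packets(records, size):
    """Split a record list into data packets of at most `size` rows."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

records = list(range(120_000))                 # e.g. 120,000 rows to load
sizes = [len(p) for p in packets(records, 50_000)]
print(sizes)  # -> [50000, 50000, 20000]
```

With smaller packets, a bad record near the end of the load forces a retry of only the last packet rather than the whole request, which is the trade-off against the per-packet commit overhead.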
Hope it helps! -
Arrived in BW Processing: Data packet not yet processed
Hello Gurus,
Data is being loaded from an export DataSource (8RL*****) to 2 data targets.
The overall QM status is red (says processing is overdue)
The details tab shows
Extraction: Error occurred
Transfer(IDoc and TRFC): Error occurred
Processing (Data packet): Error occurred
-- Transfer rule: Error occurred
Bewegungsdaten (transaction data) received. Processing being started
Message missing (35676 records): Transfer rules finished
Update rules (0 records): Missing messages
Update (0 new / 0 changed): Missing message
Processing end: Missing message
I checked the LUWs in SM58 but didn't find anything.
I checked the InfoSource (PSA): one request is in RED, but when I check the data in the PSA there are no errors.
What should I do in this case?
What might be the exact error?
Kindly inform.
Regards,
NIKEKAB
Hi,
Check the DataSource in RSA3. If it is working fine and you are able to see the data in RSA3, there is no problem at the DS level; then check the mappings and any routines in BW for that DS. If these are also fine, check the options below.
See dumps in ST22 and SM21 also.
Check the RFC connection between the ECC and BW systems, i.e. RSA1 --> Source System --> right-click on the source system and choose Check.
You must have the following profiles assigned to the BWREMOTE or ALEREMOTE user, so add them, because one of these two users is used in the background to extract the data from ECC; add these profiles in BW.
S_BI-WHM_RFC, S_BI-WHM_SPC, S_BI-WX_RFC
And also check the following things.
1. Connections from BW to ECC and ECC to BW in SM59.
2. Check the port, partner profiles, and message types in WE20 in ECC & BW.
3. Check dumps in ST22 and SM21.
4. If IDocs are stuck: see the OLTP IDoc numbers in the RSMO screen in BW (details tab, at the bottom, you can see the OLTP IDoc number). Take the IDoc numbers, go to ECC, and see the status in WE05 or WE02. If there is an error, check the log; else go to BD87 in ECC, enter the IDoc numbers, execute manually, and check in RSMO after refreshing.
5. Check for LUWs stuck in SM58 with User Name = * (star), run it, see the stuck LUWs, select your LUW, execute it manually, and check in RSMO in BW.
See in SDN
Re: Loading error in the production system
Thanks
Reddy -
Data packet not yet processing in ODS load??
Hi all,
I got an error when I loaded data from the InfoSource to the ODS. Can someone let me know why, and how to resolve it? Thank you in advance.
Here is the error message in the monitor:
<b>Warning: data packets 1 & 2 arrived in BW; processing: data packet not yet processed.
(No data packet numbers could be determined for request REQU_77H7ERP54VXW5PZZP5J6DYKP7)</b>
<b>Processing end:
transfer rules (0 records): missing message
update PSA (0 records): missing messages
update rules (0 records): missing messages</b>
John,
I don't think it's a space problem. In ST22, look at the detailed note:
what happened, and how to correct it. That will help you to solve the problem.
Check this note <b>613440</b> also.
<b>Note : 647125</b>
Symptom
A DYNPRO_FIELD_CONVERSION dump occurs on screen 450 of the RSM1 function group (saplrsm1).
Other terms
DYNPRO_FIELD_CONVERSION, 450, SAPLRSM1
Reason and Prerequisites
This is caused by a program error.
The screen contains unused, hidden fields/screen elements that are too small for the screen check that was intensified with the current Basis patch (kernel patch 880). These fields originate in the 4.0B period of BW 1.0 and are never used.
Solution
Depending on your BW system release, you must solve the problem as follows:
BW 3.0B
Import Support Package 14 for 3.0B (BW 3.0B Patch 14 or SAPKW30B14) into your BW system. This Support Package will be available when note 571695 with the short text "SAPBWNews BW 3.0B Support Package 14", which describes this Support Package in more detail, is released for customers.
BW 3.1 Content
Import Support Package 8 for 3.1 Content (BW 3.10 Patch 08 or SAPKW31008) into your BW system. This Support Package will be available when note 571743 with the short text "SAPBWNews BW 3.1 Content Support Package 08" is released for customers.
The dump occurs with the invisible G_NEW_DATUM date field on the bottom right of the screen, which is only 1 byte long and can be deleted.
You can delete the following unused fields/screen elements:
%A_G_NEW_NOW Selectionfield group
G_NEW_ZEIT Input/output field
G_NEW_UNAME Input/output field
G_NEW_DATUM Input/output field
%#AUTOTEXT021 Text field
G_NEW_NOW Selection button
G_NEW_BATCH Selection button
You can delete these fields/screen elements because they are not used anywhere.
This deletion does not cause any problems.
After you delete the fields/screen elements, you must also delete the following rows in the flow logic in screen 450:
FIELD G_NEW_DATUM MODULE DOKU_NEW_DATUM.
FIELD G_NEW_ZEIT MODULE DOKU_NEW_ZEIT.
The function group is then syntactically correct.
Unfortunately, we cannot provide an advance correction.
The aforementioned notes may already be available to provide information in advance of the Support Package release. However, in this case the short text will still contain the words "preliminary version".
For more information on BW Support Packages, see note 110934.
Thanks
Ram