Generate additional records in the start routine of a transformation
Dear "Gurus", I have the following challenge:
the records arriving in the transformation are structured like this:
20080515 ; 20080721 ; 300'000.- -> from date ; to date ; total amount
but what I need is a record for each single day with the corresponding part of the amount. But: only working days are to be counted (20080515 to 20080721 = 48 working days)!
the result should look like this:
20080515 ; 300'000.- / 48 = 6'250.-
20080516 ; 6'250.-
20080517 ; 6'250.-
20080518 ; 6'250.-
20080519 ; 6'250.-
20080721 ; 6'250.-
... and now, instead of the incoming record, all of the result records have to be passed on through the transformation. But how? Does anybody have an example? (Once I have an example of how to "send back" more than a single record to the SOURCE_PACKAGE, I should be able to handle the rest.)
Thanks for your help in advance!
regards
oliver
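For reference, the proration Oliver describes (divide the total evenly over the Monday-Friday days between the two dates) can be sketched outside ABAP, for example in Python. A real implementation would also exclude public holidays via a factory calendar, which this sketch ignores:

```python
from datetime import date, timedelta

def working_days(start, end):
    """Count Monday-Friday days in [start, end] inclusive (holidays ignored)."""
    return sum(1 for i in range((end - start).days + 1)
               if (start + timedelta(days=i)).weekday() < 5)

def prorate(start, end, total):
    """Yield one (day, share) pair per working day, splitting total evenly."""
    share = total / working_days(start, end)
    day = start
    while day <= end:
        if day.weekday() < 5:
            yield day, share
        day += timedelta(days=1)

daily = list(prorate(date(2008, 5, 15), date(2008, 7, 21), 300_000))
# 48 working days -> 48 records of 6250.0 each
```

In the expert routine Udo posts, the same inner loop would APPEND one RESULT_FIELDS line per working day instead of yielding tuples.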
Hi Oliver,
You will need to use an 'Expert Routine'.
Example:
FIELD-SYMBOLS:
  <SOURCE_FIELDS> TYPE tys_SC_1.
DATA:
  RESULT_FIELDS TYPE tys_TG_1.
$$ begin of routine - insert your code only below this line -
DATA: l_source_package TYPE tys_sc_1,
      l_mzfacul        TYPE /bic/mzfacul.
LOOP AT SOURCE_PACKAGE INTO l_source_package.
* Read the faculty master data for the current source record
  SELECT SINGLE * FROM /bic/mzfacul
    INTO l_mzfacul
    WHERE /bic/zflyfac = l_source_package-/bic/zflyfac.
  RESULT_FIELDS-calyear        = l_source_package-calyear.
  RESULT_FIELDS-/bic/zmember   = l_source_package-/bic/zmember.
  RESULT_FIELDS-/bic/zhousing  = l_source_package-/bic/zhousing.
  RESULT_FIELDS-/bic/zexamtyp  = l_source_package-/bic/zexamtyp.
  RESULT_FIELDS-/bic/zpreeduc  = l_source_package-/bic/zpreeduc.
  RESULT_FIELDS-gender         = l_source_package-gender.
  RESULT_FIELDS-/bic/zfaculmem = l_mzfacul-/bic/zfacul.
* Append a result record only when ZG_FHS is filled; you can APPEND
* as many result records per source record as you need
  IF l_source_package-/bic/zg_fhs <> 0.
    RESULT_FIELDS-/bic/zfacul    = 'FHS'.
    RESULT_FIELDS-/bic/zg_eduoff = l_source_package-/bic/zg_fhs.
    APPEND RESULT_FIELDS TO RESULT_PACKAGE.
  ENDIF.
ENDLOOP.
I hope this is enough.
Success,
Udo
Similar Messages
-
PT_QTA00 is generating repetitive records for the same quota type
PT_QTA00 is generating repetitive records for the same quota type. And one more thing: I have already maintained 10 days of quota in 2006, and when I run it for the same employee it creates one more record with 10 days again. How can we restrict this in configuration?
As said earlier, you want to set up time constraints for infotype 2006, which is maintained in view V_T582A.
You should set the Time Constraint to 2.
HR master data uses the following three time constraints:
Time Constraint 1
For the entire time that the employee works at the enterprise, exactly one valid infotype record must exist. The validity periods of the individual records must not overlap. If a new record is created, the system automatically uses the start date of the new record as the delimitation date of the old record. Gaps are only allowed between the employee's entry date and the start date of the first record.
Time constraint 1 must be used for all of the infotypes containing information that must be available at all times. This is particularly true of personal and organizational assignment data.
If a record is delimited because of time constraint 1, the system displays an appropriate message.
Time Constraint 2
No more than one valid record can exist at any one time. Records with constraint 2 must not overlap. Their existence is not obligatory. If a new record is created, the system automatically delimits the previous record, if one exists.
If a record is delimited because of time constraint 2, the system displays an appropriate message.
Time Constraint 3
Any number of valid records can exist at any one time. The individual records do not conflict with each other. -
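As an illustration only (the interval tuples are simplified stand-ins, not the real infotype structure), the delimitation behaviour of time constraint 2 can be sketched as:

```python
from datetime import date, timedelta

def insert_with_tc2(records, new):
    """Time constraint 2: at most one valid record at any time.

    Inserting a new (begda, endda) interval delimits an overlapping
    predecessor by setting its end date to the day before the new record
    starts; in this sketch, overlapping records that start on or after the
    new start date are simply dropped."""
    result = []
    for begda, endda in records:
        if begda <= new[1] and new[0] <= endda:      # overlaps the new record
            if begda < new[0]:
                result.append((begda, new[0] - timedelta(days=1)))
        else:
            result.append((begda, endda))
    result.append(new)
    return sorted(result)

quota = [(date(2006, 1, 1), date(9999, 12, 31))]
quota = insert_with_tc2(quota, (date(2006, 7, 1), date(9999, 12, 31)))
# the old record is delimited to 2006-06-30; only one record is valid at a time
```

With time constraint 2 on infotype 2006, running PT_QTA00 again would delimit or replace the existing quota record instead of stacking a duplicate next to it.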
Adding additional Records in Transformation Routine
Hi,
I'm trying to add an additional record to RESULT_PACKAGE in the end routine based on some conditions, and to load it into the InfoCube.
I'm doing this by assigning some values to the work area and appending it to RESULT_PACKAGE. But when running the data transfer process I get an error (message: "Error while writing it to the fact table"). If I don't append any records, it works fine. But I have a requirement to do that. How can I achieve it? Can anybody help me, please?
Best Regards,
Anbu.
Hi,
You will find a HowTo guide here : https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e73bfc19-0e01-0010-23bc-ef0ad53f2fab
Hope that helps.
Dieu -
ABAP - CUA : Initial load : too many records in the CUA?
We are running :
SP03 for IDENT. CENTER DESIGNTIME 7.1 Patch 0
SP03 for IDENTITY CENTER RUNTIME 7.1 Patch 1
SP03 for NW IDM IC UIS 7.00 Patch 1
We have connected our customer's CUA system to IdM: we created an Identity Store called 'SAP_Master', created the CUA repository, defined the required attributes as described in the guide 'IDM 7.1 - IdM For SAP Systems - Configuration 7-3.pdf', created the jobs based upon the templates, etc. The dispatcher used has 'run provisioning jobs' disabled.
On our sandbox server, when we connect to our sandbox CUA system (CUA_YS5_200), everything is ok, the 'AS ABAP - Initial load' job with only 'ReadABAPRoles' enabled, runs fine.
On our QA system, when we connect to our 'production' CUA system (CUA_YP0_400), the 'AS ABAP - Initial load' job with
only 'ReadABAPRoles' enabled, finished with the message 'could not start job, rescheduling'. Since there is a huge number of records (we looked it up in the system: 311.679 records), we decided to switch on the parameter 'bootstrap job'. Now the result is that it takes forever; after half a day the job is still running. In the database, table 'sapCUA_YP0_400role' is still completely empty (no records). Therefore, it seemed interesting to connect our QA IdM system to our development CUA system (CUA_YS5_200). After a while, the exact same job finished, and table 'sapCUA_YS5_200role' contained 18.580 records.
After some additional testing, we discovered that the cause of the issue might be that the number of records in our CUA is too big.
In the Java code of the fromSAP pass there are 2 calls to the SAP system for reading the roles into IdM. The first one reads table USRSYSACT (311,000 records), the second one reads table USRSYSACTT (1,000,000 records). All these records are stored in a Java hashmap; we think that 1 million records exceeds what the hashmap can handle, although no Java error is thrown.
When we debug the function module RFC_READ_TABLE and change the row count to 100,000, everything works fine. When we set the row count to 200,000, the Java code does not generate an error, but the job in IdM never ends...
When running function module RFC_READ_TABLE in the backend system, the 1,000,000 records are processed in less than one minute. So apparently, the issue is related to the processing in the Java code.
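One common workaround for reads of this size is to page through RFC_READ_TABLE with its ROWSKIPS/ROWCOUNT parameters instead of fetching everything in a single call, so no single result set has to be built in memory at once. A minimal, connection-agnostic sketch (the `fetch` callable stands in for the actual RFC call and is an assumption, not IdM's API):

```python
def fetch_in_pages(fetch, page_size=100_000):
    """Read a large table in fixed-size pages.

    `fetch(rowskips, rowcount)` is assumed to behave like RFC_READ_TABLE's
    ROWSKIPS/ROWCOUNT parameters: skip the first `rowskips` rows and return
    at most `rowcount` rows. A short page signals the end of the table."""
    rows, skip = [], 0
    while True:
        page = fetch(skip, page_size)
        rows.extend(page)
        if len(page) < page_size:
            return rows
        skip += page_size

# Example with an in-memory stand-in for the RFC call:
table = [f"ROLE_{i}" for i in range(250)]
assert fetch_in_pages(lambda s, c: table[s:s + c], page_size=100) == table
```

This mirrors the debugging observation above: 100,000-row calls work, so repeating them with increasing ROWSKIPS should retrieve the full million rows without one giant in-memory structure.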
Java Dispatcher heap size is set to 1024.
Has anybody already come across this issue?
Thanks & best regards,
Kevin
Installing the patch, re-importing the SAP Provisioning Framework (I selected 'update'), and recreating the jobs didn't yield any result.
When examining pass 'ReadABAPRoles' of job 'AS ABAP - Initial Load' -> tab 'source', there are no scripts used.
After applying the patch, we decided anyway to verify the scripts (sap_getRoles, sap_getUserRepositories) in our Identity Center against those of 'Note 1398312 - SAP NW IdM Provisioning Framework for SAP Systems', and they are different.
The file size of SAP Provisioning Framework_Folder.mcc is also exactly the same in SP3 Patch 0 and Patch 1.
Opening file SAP Provisioning Framework_Folder.mcc with Wordpad : searched for 'sap_getRoles' :
<GLOBALSCRIPT>
<SCRIPTREVISIONNUMBER/>
<SCRIPTLASTCHANGE>2009-05-07 08:00:23.54</SCRIPTLASTCHANGE>
<SCRIPTLANGUAGE>JScript</SCRIPTLANGUAGE>
<SCRIPTID>30</SCRIPTID>
<SCRIPTDEFINITION> ... string was too long to copy
paste ... </SCRIPTDEFINITION>
<SCRIPTLOCKDATE/>
<SCRIPTHASH>0940f540423630687449f52159cdb5d9</SCRIPTHASH>
<SCRIPTDESCRIPTION/>
<SCRIPTNAME>sap_getRoles</SCRIPTNAME>
<SCRIPTLOCKSTATE>0</SCRIPTLOCKSTATE>
-> Script last change 2009-05-07 08:00:23.54 -> that's not an update!
So I assume the updates mentioned in Note 1398312 aren't included in SP3 Patch 1. I manually replaced the current scripts with those of the note and re-tested: no luck, same issue.
Thanks again for the help,
Kevin -
Not generating any records in the execution of a script from a package
Hi everybody,
I am developing some script logics and I have got some problems when executing them. Basically the problem is the same in all of them, since the structure of the scripts is very similar.
What I am trying to do is calculate a certain data value from other data located in a different region within the same application. To develop it I have used the LOOKUP function referring to the same application as the destination data. When I save and validate the script, I get no errors, and it completes successfully. In addition, when I run the package, it reports success.
However, the package log shows that no records are generated. It is really difficult for me to understand, because exactly the same code sometimes works correctly and other times fails, with a difference as small as the base-member selection of one dimension.
I attach two scripts, the one which is working, and another one that is not working. Moreover, I attach the log of these packages to show you the result that is obtained:
Script of the good package
*LOOKUP PRESUP_PROTO
*DIM CONCEP_RECIPROCO="TOTAL_CONCEPTO"
*DIM CONTRAPARTES="TOTAL_CONTRAPARTE"
*DIM GRUPOS="NON_GROUP"
*DIM MONEDA_REPORTE="LC"
*DIM PROY_RECIPROCO="TOTAL_DIM_PR"
*DIM MEASURES="PERIODIC"
*DIM MEDIDA="DATO_FINANCIERO"
*DIM ORIGEN="DATO_PRESUPUESTADO"
*DIM TIPO_DATO="PPTO_DET"
*DIM TIPO_INVERSION="DUMMY_INVERSION"
*DIM NEGOCIO="TOTAL_NEGOCIO"
*DIM CEBE="TOT_DIM_CEBE"
*DIM PROYECTO="TOTAL_PROYECTO"
*DIM TIEMPO_MENSUAL="2011.JAN"
*DIM NEGOCIO_SOCIEDAD="0713" <---- difference
*DIM BDI:CONCEPTO2="BDI"
*ENDLOOKUP
*XDIM_MEMBERSET PROYECTO="S_DUMMY"
*XDIM_MEMBERSET CONCEP_RECIPROCO="DUMMY_CR"
*XDIM_MEMBERSET CONTRAPARTES="DUMMY_CNTRPT"
*XDIM_MEMBERSET GRUPOS="NON_GROUP"
*XDIM_MEMBERSET MONEDA_REPORTE="LC"
*XDIM_MEMBERSET PROY_RECIPROCO="DUMMY_PR"
*XDIM_MEMBERSET MEASURES="PERIODIC"
*XDIM_MEMBERSET MEDIDA="DATO_FINANCIERO"
*XDIM_MEMBERSET ORIGEN="DUMMY_ORIGEN"
*XDIM_MEMBERSET TIPO_DATO="PPTO_DET"
*XDIM_MEMBERSET TIPO_INVERSION="DUMMY_INVERSION"
*XDIM_MEMBERSET NEGOCIO="DUMMY_NEGOCIO" <---- difference
*XDIM_MEMBERSET CEBE="DUMMY_CEBE"
*XDIM_MEMBERSET NEGOCIO_SOCIEDAD="0713" <---- difference
*XDIM_MEMBERSET TIEMPO_MENSUAL="2011.JAN"
*XDIM_MEMBERSET CONCEPTO2="016.0228"
*WHEN CEBE
*IS "DUMMY_CEBE"
*REC(EXPRESSION=LOOKUP(BDI),CONCEPTO2="016.0228")
*REC(EXPRESSION=LOOKUP(BDI),CONCEPTO2="016.0217")
*ENDWHEN
*COMMIT
Script of the wrong package
*LOOKUP PRESUP_PROTO
*DIM CONCEP_RECIPROCO="TOTAL_CONCEPTO"
*DIM CONTRAPARTES="TOTAL_CONTRAPARTE"
*DIM GRUPOS="NON_GROUP"
*DIM MONEDA_REPORTE="LC"
*DIM PROY_RECIPROCO="TOTAL_DIM_PR"
*DIM MEASURES="PERIODIC"
*DIM MEDIDA="DATO_FINANCIERO"
*DIM ORIGEN="DATO_PRESUPUESTADO"
*DIM TIPO_DATO="PPTO_DET"
*DIM TIPO_INVERSION="DUMMY_INVERSION"
*DIM NEGOCIO="TOTAL_NEGOCIO"
*DIM CEBE="TOT_DIM_CEBE"
*DIM PROYECTO="TOTAL_PROYECTO"
*DIM TIEMPO_MENSUAL="2011.JAN"
*DIM NEGOCIO_SOCIEDAD="0713_N_3111" <---- difference
*DIM BDI:CONCEPTO2="BDI"
*ENDLOOKUP
*XDIM_MEMBERSET PROYECTO="S_DUMMY"
*XDIM_MEMBERSET CONCEP_RECIPROCO="DUMMY_CR"
*XDIM_MEMBERSET CONTRAPARTES="DUMMY_CNTRPT"
*XDIM_MEMBERSET GRUPOS="NON_GROUP"
*XDIM_MEMBERSET MONEDA_REPORTE="LC"
*XDIM_MEMBERSET PROY_RECIPROCO="DUMMY_PR"
*XDIM_MEMBERSET MEASURES="PERIODIC"
*XDIM_MEMBERSET MEDIDA="DATO_FINANCIERO"
*XDIM_MEMBERSET ORIGEN="DUMMY_ORIGEN"
*XDIM_MEMBERSET TIPO_DATO="PPTO_DET"
*XDIM_MEMBERSET TIPO_INVERSION="DUMMY_INVERSION"
*XDIM_MEMBERSET NEGOCIO="N_3111" <---- difference
*XDIM_MEMBERSET CEBE="DUMMY_CEBE"
*XDIM_MEMBERSET NEGOCIO_SOCIEDAD="0713_N_3111" <---- difference
*XDIM_MEMBERSET TIEMPO_MENSUAL="2011.JAN"
*XDIM_MEMBERSET CONCEPTO2="016.0228"
*WHEN CEBE
*IS "DUMMY_CEBE"
*REC(EXPRESSION=LOOKUP(BDI),CONCEPTO2="016.0228")
*REC(EXPRESSION=LOOKUP(BDI),CONCEPTO2="016.0217")
*ENDWHEN
*COMMIT
Log of the good package
ENDWHEN ACCUMULATION: 2 RECORDS ARE GENERATED.
Log of the wrong package
ENDWHEN ACCUMULATION: 0 RECORDS ARE GENERATED.
Many thanks in advance,
Any help will be much appreciated,
Antonio
Hi Mario,
Have you tried the break statement?
http://ss64.com/ps/break.html
Very simple example:
Write-Host 1
Write-Host 2
break
Write-Host 3
Don't retire TechNet! -
(Don't give up yet - 12,700+ strong and growing) -
Adding Records In The User Exit
I have a requirement to modify the transactional data user exit for DataSource 0CO_PC_PCP_01. Currently this DataSource extracts cost estimate data, and using some complex rules I need to determine the active cost estimate for each month and duplicate the records accordingly.
I.e. if a cost estimate dated 01.01.2005 is valid for all of 2005, then I need to duplicate all corresponding records 11 times, i.e. one set of additional records per month.
I have created the ABAP code to do this and for small volumes of data it works perfectly. However if I increase the data volume the exit seems to run and run.
The code is as efficient as it can be and there are no database selects within it.
My questions are:
1) Is it acceptable to add a high volume of new data in the user exit, or should I be looking at an alternative?
2) Does the user exit get called multiple times during a data load, i.e. once per data package? If so, am I correct in saying it is dangerous to add records this way, because some of the records added could already exist and be processed within another data package, and records would then be duplicated?
3) One approach I have thought of but not yet tested is to use the function module(s) used by the SAP standard extractor to populate a table on R/3, and then use a custom DataSource to retrieve the data from this table.
Any thoughts/views/answers on the above would be appreciated.
Many thanks
Simon,
1. Let's say that the data packet size is 20,000 and all of your records are active. Then when it leaves the user exit, the size of the data packet would be 20,000 x 12. That is a very large packet. Probably not a good idea to do this.
2. The user exit will be called for each data packet. That is correct, and yes, it would be wrong to do this per data packet: if a record exists in two data packets, records are generated for it multiple times.
3. Why are you not considering applying the logic in BW rather than doing it in R/3? That should be relatively easy.
This is how you can do it. When loading the data into BW, load it into an ODS object. Add a new InfoObject to the ODS object (ZACTIVE, for example). In the update rules between the InfoSource and the ODS, apply the logic that you are using to determine whether each record is active or not. Now from this ODS, load it to either an InfoCube or another ODS, wherever you want to. In the update rules between the two, you can use the "return table" to generate multiple records where ZACTIVE = true, or you can generate them in the start routine instead.
By doing this you eliminate the case of generating duplicate records.
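The month-explosion described above (one output record per month for which the cost estimate is active) can be sketched as follows; the field name `calmonth` and the dict shape are illustrative, not the real transfer structure:

```python
from datetime import date

def explode_by_month(record, valid_from, valid_to):
    """Duplicate one cost-estimate record into one record per calendar
    month in [valid_from, valid_to]; the day of month is ignored."""
    out = []
    year, month = valid_from.year, valid_from.month
    while (year, month) <= (valid_to.year, valid_to.month):
        out.append({**record, "calmonth": f"{year:04d}{month:02d}"})
        month += 1
        if month > 12:
            year, month = year + 1, 1
    return out

rows = explode_by_month({"amount": 100.0}, date(2005, 1, 1), date(2005, 12, 31))
# a record valid for all of 2005 becomes 12 records, 200501 .. 200512
```

This matches the "duplicate 11 times" in the question: one original record plus 11 copies, one per remaining month.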
Hope that helps.
Gova -
CBO generating different plans for the same data in similar Environments
Hi All
I have been trying to compare an SQL statement from 2 different but similar environments built to the same hardware specs. The issue I am facing is that in environment A the query executes in less than 2 minutes, with a plan mostly showing full table scans and hash joins, whereas in environment B (the problematic one) it times out after 2 hours with an 'unable to extend tablespace' error. The statistics are up to date in both environments for both tables and indexes. System parameters are exactly the same (Oracle defaults, except for db_file_multiblock_read_count).
Both environments have the same DB parameters for db_file_multiblock_read_count (16), optimizer (see below), hash_area_size (131072), pga_aggregate_target (1G), db_block_size (8192), etc. SREADTIM, MREADTIM, CPUSPEED and MBRC are all null in aux_stats$ in both environments, because workload statistics were never collected, I believe.
Attached are details about the SQL with table stats, the SQL itself and index stats. My main concern is the CBO generating different plans for similar data and statistics on the same hardware and software specs. Is there anything else I should consider? I generally see environment B being very slow, with plans always tending towards nested loops and index scans, whereas what we really need in many cases is a sensible FTS. One very surprising thing is METER_CONFIG_HEADER below, which has just 80 blocks of data yet is being driven through an index scan.
show parameter optimizer
optimizer_dynamic_sampling integer 2
optimizer_features_enable string 10.2.0.4
optimizer_index_caching integer 0
optimizer_index_cost_adj integer 100
optimizer_mode string ALL_ROWS
optimizer_secure_view_merging boolean TRUE
**Environment**
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Solaris: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
Note: there are slight differences in the number of records in the attached sheet. However, I wanted to mention that I have tested with the exact same data and got similar results, but I couldn't retain the data until collecting the details in the attachment.
TEST ITEM 1 COMPARE TABLE LEVEL STATS used by CBO
ENVIRONMENT A
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 3607425 167760 5/02/2013 22:11
METER_CONFIG_HEADER 3658 80 5/01/2013 0:07
METER_CONFIG_ITEM 32310 496 5/01/2013 0:07
NMI 1899024 33557 18/02/2013 10:55
REGISTER 4830153 101504 18/02/2013 9:57
SDP_LOGICAL_ASSET 1607456 19137 18/02/2013 15:48
SDP_LOGICAL_REGISTER 5110781 78691 18/02/2013 9:56
SERVICE_DELIVERY_POINT 1425890 42468 18/02/2013 13:54
ENVIRONMENT B
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 4133939 198570 16/02/2013 10:02
METER_CONFIG_HEADER 3779 80 16/02/2013 10:55
METER_CONFIG_ITEM 33720 510 16/02/2013 10:55
NMI 1969000 33113 16/02/2013 10:58
REGISTER 5837874 120104 16/02/2013 11:05
SDP_LOGICAL_ASSET 1788152 22325 16/02/2013 11:06
SDP_LOGICAL_REGISTER 6101934 91088 16/02/2013 11:07
SERVICE_DELIVERY_POINT 1447589 43804 16/02/2013 11:11
TEST ITEM 2 COMPARE INDEX STATS used by CBO
ENVIRONMENT A
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 9878 67 147 12982 869801 3553095
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 7291 2747 2 639 1755977 3597916
ASSET SYS_C00102592 UNIQUE 2 12488 3733831 1 1 3726639 3733831
METER_CONFIG_HEADER SYS_C0092052 UNIQUE 1 12 3670 1 1 3590 3670
METER_CONFIG_ITEM SYS_C0092074 UNIQUE 1 104 32310 1 1 32132 32310
NMI IDX_NMI_ID NONUNIQUE 2 6298 844853 1 2 1964769 1965029
NMI IDX_NMI_ID_NK NONUNIQUE 2 6701 1923072 1 1 1922831 1923084
NMI IDX_NMI_STATS NONUNIQUE 1 106 4 26 52 211 211
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 12498 795 15 2899 2304831 4711808
REGISTER SYS_C00102653 UNIQUE 2 16942 5065660 1 1 5056855 5065660
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 3667 1607968 1 1 1607689 1607982
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 3811 668727 1 2 1606204 1607982
SDP_LOGICAL_ASSET SYS_C00102665 UNIQUE 2 5116 1529606 1 1 1528136 1529606
SDP_LOGICAL_REGISTER SYS_C00102677 UNIQUE 2 17370 5193638 1 1 5193623 5193638
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 4406 676523 1 2 1423247 1425890
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 7374 676523 1 2 1458238 1461108
SERVICE_DELIVERY_POINT SYS_C00102687 UNIQUE 2 4737 1416207 1 1 1415022 1416207
ENVIRONMENT B
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 8606 121 71 16428 1987833 4162257
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 8432 1780146 1 1 2048170 4162257
ASSET SYS_C00116157 UNIQUE 2 13597 4162263 1 1 4158759 4162263
METER_CONFIG_HEADER SYS_C00116570 UNIQUE 1 12 3779 1 1 3734 3779
METER_CONFIG_ITEM SYS_C00116592 UNIQUE 1 107 33720 1 1 33459 33720
NMI IDX_NMI_ID NONUNIQUE 2 6319 683370 1 2 1970460 1971313
NMI IDX_NMI_ID_NK NONUNIQUE 2 6597 1971293 1 1 1970771 1971313
NMI IDX_NMI_STATS NONUNIQUE 1 98 48 2 4 196 196
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 15615 1273 12 2109 2685924 5886582
REGISTER SYS_C00116748 UNIQUE 2 19533 5886582 1 1 5845565 5886582
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 4111 1795084 1 1 1758441 1795130
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 4003 674249 1 2 1787987 1795130
SDP_LOGICAL_ASSET SYS_C004520 UNIQUE 2 5864 1795130 1 1 1782147 1795130
SDP_LOGICAL_REGISTER SYS_C004539 UNIQUE 2 20413 6152850 1 1 6073059 6152850
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 3227 660649 1 2 1422572 1447803
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 6399 646257 1 2 1346948 1349993
SERVICE_DELIVERY_POINT SYS_C00128706 UNIQUE 2 4643 1447946 1 1 1442796 1447946
TEST ITEM 3 COMPARE PLANS
ENVIRONMENT A
Plan hash value: 4109575732
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 13 | 2067 | | 135K (2)| 00:27:05 |
| 1 | HASH UNIQUE | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 2 | HASH JOIN | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 3 | HASH JOIN | | 6 | 900 | | 135K (2)| 00:27:04 |
|* 4 | HASH JOIN ANTI | | 1 | 137 | | 135K (2)| 00:27:03 |
|* 5 | TABLE ACCESS BY INDEX ROWID| NMI | 1 | 22 | | 5 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | | 95137 (2)| 00:19:02 |
|* 7 | HASH JOIN | | 1 | 109 | | 95132 (2)| 00:19:02 |
|* 8 | TABLE ACCESS FULL | ASSET | 36074 | 1021K| | 38553 (2)| 00:07:43 |
|* 9 | HASH JOIN | | 90361 | 7059K| 4040K| 56578 (2)| 00:11:19 |
|* 10 | HASH JOIN | | 52977 | 3414K| 2248K| 50654 (2)| 00:10:08 |
|* 11 | HASH JOIN | | 39674 | 1782K| | 40101 (2)| 00:08:02 |
|* 12 | TABLE ACCESS FULL | REGISTER | 39439 | 1232K| | 22584 (2)| 00:04:32 |
|* 13 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 675K| 12M| | 9412 (2)| 00:01:53 |
|* 15 | TABLE ACCESS FULL | SDP_LOGICAL_ASSET | 1178K| 15M| | 4262 (2)| 00:00:52 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | | 2 (0)| 00:00:01 |
| 17 | VIEW | | 39674 | 232K| | 40101 (2)| 00:08:02 |
|* 18 | HASH JOIN | | 39674 | 1046K| | 40101 (2)| 00:08:02 |
|* 19 | TABLE ACCESS FULL | REGISTER | 39439 | 500K| | 22584 (2)| 00:04:32 |
|* 20 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 21 | TABLE ACCESS FULL | METER_CONFIG_HEADER | 3658 | 47554 | | 19 (0)| 00:00:01 |
|* 22 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 7590 | 68310 | | 112 (2)| 00:00:02 |
Predicate Information (identified by operation id):
2 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
3 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
4 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
5 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
7 - access("ASSET_CD"="EQUIP_CD" AND "SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
8 - filter("ROW_CURRENT_IND"='Y')
9 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
10 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
11 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
12 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='4' OR
SUBSTR("REGISTER_ID_CD",1,1)='5' OR SUBSTR("REGISTER_ID_CD",1,1)='6') AND "ROW_CURRENT_IND"='Y')
13 - filter("ROW_CURRENT_IND"='Y')
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y')
16 - access("NMI_SK"="NMI_SK")
18 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
19 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='1' OR
SUBSTR("REGISTER_ID_CD",1,1)='2' OR SUBSTR("REGISTER_ID_CD",1,1)='3') AND "ROW_CURRENT_IND"='Y')
20 - filter("ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
22 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
ENVIRONMENT B
Plan hash value: 2826260434
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 181 | 103K (2)| 00:20:47 |
| 1 | HASH UNIQUE | | 1 | 181 | 103K (2)| 00:20:47 |
|* 2 | HASH JOIN ANTI | | 1 | 181 | 103K (2)| 00:20:47 |
|* 3 | HASH JOIN | | 1 | 176 | 56855 (2)| 00:11:23 |
|* 4 | HASH JOIN | | 1 | 163 | 36577 (2)| 00:07:19 |
|* 5 | TABLE ACCESS BY INDEX ROWID | ASSET | 1 | 44 | 4 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | 9834 (2)| 00:01:59 |
| 7 | NESTED LOOPS | | 1 | 87 | 9830 (2)| 00:01:58 |
| 8 | NESTED LOOPS | | 1 | 74 | 9825 (2)| 00:01:58 |
|* 9 | HASH JOIN | | 1 | 52 | 9820 (2)| 00:01:58 |
|* 10 | TABLE ACCESS BY INDEX ROWID| METER_CONFIG_HEADER | 1 | 14 | 1 (0)| 00:00:01 |
| 11 | NESTED LOOPS | | 1 | 33 | 116 (2)| 00:00:02 |
|* 12 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 1 | 19 | 115 (2)| 00:00:02 |
|* 13 | INDEX RANGE SCAN | SYS_C00116570 | 1 | | 1 (0)| 00:00:01 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 723K| 13M| 9699 (2)| 00:01:57 |
|* 15 | TABLE ACCESS BY INDEX ROWID | NMI | 1 | 22 | 5 (0)| 00:00:01 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | 2 (0)| 00:00:01 |
|* 17 | TABLE ACCESS BY INDEX ROWID | SDP_LOGICAL_ASSET | 1 | 13 | 5 (0)| 00:00:01 |
|* 18 | INDEX RANGE SCAN | IDX_SLA_SDP_SK | 2 | | 2 (0)| 00:00:01 |
|* 19 | INDEX RANGE SCAN | IDX_A_SAPINTLOGDEV_SK | 2 | | 2 (0)| 00:00:01 |
|* 20 | TABLE ACCESS FULL | REGISTER | 76113 | 2378K| 26743 (2)| 00:05:21 |
|* 21 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
| 22 | VIEW | | 90889 | 443K| 47021 (2)| 00:09:25 |
|* 23 | HASH JOIN | | 90889 | 2307K| 47021 (2)| 00:09:25 |
|* 24 | TABLE ACCESS FULL | REGISTER | 76113 | 966K| 26743 (2)| 00:05:21 |
|* 25 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
Predicate Information (identified by operation id):
2 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
3 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK" AND
"SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
4 - access("ASSET_CD"="EQUIP_CD")
5 - filter("ROW_CURRENT_IND"='Y')
9 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
10 - filter("ROW_CURRENT_IND"='Y')
12 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
13 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
16 - access("NMI_SK"="NMI_SK")
17 - filter("ROW_CURRENT_IND"='Y')
18 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
19 - access("SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
20 - filter((SUBSTR("REGISTER_ID_CD",1,1)='4' OR SUBSTR("REGISTER_ID_CD",1,1)='5' OR
SUBSTR("REGISTER_ID_CD",1,1)='6') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
23 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
24 - filter((SUBSTR("REGISTER_ID_CD",1,1)='1' OR SUBSTR("REGISTER_ID_CD",1,1)='2' OR
SUBSTR("REGISTER_ID_CD",1,1)='3') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
25 - filter("ROW_CURRENT_IND"='Y')
Edited by: abhilash173 on Feb 24, 2013 9:16 PM
Edited by: abhilash173 on Feb 24, 2013 9:18 PM
Hi Paul,
I misread your question initially. The system stats are outdated in both (same result as seen from aux_stats$). I am not a DBA and do not have access to gather fresh system stats.
select * from sys.aux_stats$
SNAME PNAME PVAL1 PVAL2
SYSSTATS_INFO STATUS NULL COMPLETED
SYSSTATS_INFO DSTART NULL 02-16-2011 15:24
SYSSTATS_INFO DSTOP NULL 02-16-2011 15:24
SYSSTATS_INFO FLAGS 1 NULL
SYSSTATS_MAIN CPUSPEEDNW 1321.20523 NULL
SYSSTATS_MAIN IOSEEKTIM 10 NULL
SYSSTATS_MAIN IOTFRSPEED 4096 NULL
SYSSTATS_MAIN SREADTIM NULL NULL
SYSSTATS_MAIN MREADTIM NULL NULL
SYSSTATS_MAIN CPUSPEED NULL NULL
SYSSTATS_MAIN MBRC NULL NULL
SYSSTATS_MAIN MAXTHR NULL NULL
SYSSTATS_MAIN SLAVETHR NULL NULL -
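Since SREADTIM, MREADTIM and MBRC are null, the 10.2 optimizer falls back to the noworkload statistics above. The commonly documented derivation (a sketch of the published formulas, not a dump of Oracle internals) is:

```python
def noworkload_timings(ioseektim, iotfrspeed, db_block_size, mbrc):
    """Derive the read timings the CBO synthesizes from noworkload stats:
        SREADTIM = IOSEEKTIM + DB_BLOCK_SIZE / IOTFRSPEED
        MREADTIM = IOSEEKTIM + MBRC * DB_BLOCK_SIZE / IOTFRSPEED
    (results in milliseconds; IOTFRSPEED is bytes per millisecond)."""
    sreadtim = ioseektim + db_block_size / iotfrspeed
    mreadtim = ioseektim + mbrc * db_block_size / iotfrspeed
    return sreadtim, mreadtim

# Values from the posted configuration: IOSEEKTIM=10, IOTFRSPEED=4096,
# db_block_size=8192, db_file_multiblock_read_count=16
print(noworkload_timings(10, 4096, 8192, 16))   # (12.0, 42.0)
```

If both environments really share these values, the synthesized I/O timings are identical, which points the plan difference back at the object statistics (note, for example, the very different CLUSTERING_FACTOR values in the index comparison above) rather than at system stats.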
Record all the data acquired by DAQ
Hi all,
I have a problem when trying to record all the data acquired by the DAQ card. My DAQ is an NI 6259, and the LabVIEW version is 2009.
Basically, what I need to do is use multiple channels to get the voltages at different points, then subtract the values of two channels (V+ and V-) to get the voltage between two points. In my VI the sampling mode is continuous, the rate is 100 kHz, the DAQmx Read is inside a while loop, and the number of samples per channel is set to 100. After each loop execution I should get 100 samples. I just used Format Into File to write the data into a .txt file. It seems that after one loop only the last sample is recorded instead of all 100. Previously I averaged the 100 samples inside the acquisition loop to get one value, so the writing function is OK, but it really slows down the program. So I want to write all the raw data to the .txt file so I can process it later in MATLAB. Is there any way to achieve this? Thank you very much.
Attached are my VI for the 2-channel voltage measurement and the current source control sub-VI. I am new to LabVIEW, so this question has really bothered me even after reading some similar topics in the forum.
Regards,
Hongxiang
Solved!
Go to Solution.
Attachments:
Dual Channel Voltage Measurement.vi 33 KB
Current Source 1.vi 49 KB
Thank you for your reply. I tried that; it actually records more data per pulse than before, but fewer pulses were recorded (there should be 8 pulses in 20 seconds, but only 2 were recorded). I think it was because the FOR loop inside slows down program execution, and sometimes error 200279 occurred. I think it would be better to separate the reading and recording processes. Someone suggested that, rather than writing to the file inside the loop, I accumulate the data from all of the loop iterations into one large array containing all of the samples and then pass that to Format Into File outside the loop. Do you know how to achieve that? I tried it as below, but it seems totally wrong; only two data points were written.
(The sub-VI named 1 (see attachment) converts the 2D array to 1D.) Thanks.
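The accumulate-then-write suggestion amounts to the following pattern; the Python names are stand-ins for the LabVIEW pieces (`read_chunk` plays the DAQmx Read, `write_text` the Format Into File node), assumed purely for illustration:

```python
def acquire_then_save(read_chunk, n_iterations, write_text):
    """Collect every chunk produced by the acquisition loop and write the
    file once at the end, instead of formatting inside the loop (which is
    what slows the loop down and drops samples)."""
    chunks = []
    for _ in range(n_iterations):
        chunks.append(read_chunk())            # keep all samples per read
    samples = [x for chunk in chunks for x in chunk]
    write_text("\n".join(f"{x:.6e}" for x in samples))   # single write
    return samples

out = {}
data = acquire_then_save(lambda: [1.0, 2.0], 3, lambda s: out.__setitem__("txt", s))
# all 6 samples survive; the file is written exactly once
```

In LabVIEW terms: let the while-loop tunnel auto-index (or grow the array in a shift register), then wire the accumulated array to Format Into File or Write To Spreadsheet File outside the loop.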
Attachments:
2D_Data_Reader.vi 9 KB -
How to run report to show all the records in the form view of Siebel 8.1.1
Hi.
My template has a "for-each" section and there is more than 1 record in the form view. But when generating the report, there is only one record in it. Can anyone offer help? Thanks a lot.
Edited by: user11948585 on 2009-11-18 6:08 PM
Hi,
Could you generate a sample XML from the BIP Administrator for the IO on which the report is based, apply it to the RTF, and see if you get the desired values?
Thanks!! -
Revaluate data record at the time of loading from flat file or BI Cube
Hello Friends,
I want to revaluate a data record at the time of loading, using the transformation or conversion file, based on some condition.
I have a rule, for instance, to identify whether a record is supposed to be multiplied by -1 or not.
For example,
*if (ID(1:5) = str(00070) then(Record-1)
ID(1:5) = str(00071) then (Record-2)
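For what it's worth, the intended revaluation can be stated outside BPC as a lookup of a sign factor on the first five characters of the ID. This is a hypothetical sketch (the field names `ID`/`VALUE` and the factor table are placeholders, and the exact meaning of "Record-1"/"Record-2" above is a guess), not transformation-file syntax:

```python
# Hypothetical rule: the sign factor depends on the first five characters
# of the ID; the factor table below is a placeholder, not BPC syntax.
FACTORS = {"00070": -1, "00071": 1}

def revaluate(record):
    factor = FACTORS.get(record["ID"][:5], 1)
    record["VALUE"] *= factor
    return record

assert revaluate({"ID": "0007099", "VALUE": 250.0})["VALUE"] == -250.0
```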
Can you please guide me how can I achieve this by using Transformation file or Conversion file?
Regards,
Vishal.
Hi Nilanjan,
Thanks for reply.
I tried the script you suggested in the conversion file for Account.
But it is not working for me.
I even tried simple multiplication and also addition in the Formula column; neither works.
External --> *
Internal --> *
Formula ---> Value * -1
Above conversion file for Account was not working for me.
then I tried
Formula --> Value + 100
It also did not work for me.
Kindly suggest if I am doing anything wrong in above file.
Thanks,
Nilanjan. -
While creating a PDF document using Adobe Acrobat 9 Pro, it generates additional blank pages. How do I resolve this issue?
I might make a wild guess that you are using Word and have used hard page breaks. If that is the case, then likely what happens is that when you print to PDF, the document is reflowed to match the printer, and the forced page break may end up on the next page, forcing a second blank page. Just a guess.
-
Multiple info records for the same material record in the catalog
Hi Guys,
I have got a sort of puzzle I cannot solve at the moment; maybe someone can help me solve it. The situation in our system landscape is as follows:
Systems:
SRM 5.0 (Classic implementation scenario)
ECC 6.0
SRM-MDM 2.0
PI 2005
Material master is maintained in R/3 and the materials are replicated to SRM. From SRM we replicate all materials to the catalog repository in SRM-MDM.
As the material master in R/3 is vendor independent, the materials replicated to SRM also have no vendor attached. For the initial material replication from SRM to the SRM-MDM catalog, this also means that the products in the main table of the repository, 'catalog positions', will not have a vendor or vendor number attached (i.e. no reference to a source of supply).
To assign a source of supply in the repository to a product (record), we will use info records which are replicated from R/3 to SRM-MDM. The mapping for this in the Import Manager can only be based on the product number, as this is the only unique value available which can map an info record to an already existing (but vendor-less) product item in the catalog.
As long as every item in the repository has only one unique info record, meaning one source of supply with a specific price, there is no issue in the above scenario. Based on the product-id, the info record data will be matched with the material item in the catalog.
The issue for me arises when multiple info records exist for the same material. The info records can have different vendors and different prices. All possible vendors should be available in the catalog for the same material (at least, this is the requirement).
At this point the product-id will no longer be an unique value. Importing the info records will cause problems, as only one material record is available in the repository for the specific product-id (remember that when doing a initial replication of the material master to the catalog, no vendor data is replicated).
Has anyone had this issue before, and does anyone know a solution? Is it, for example, possible during the import of the info records in the Import Manager to duplicate material records in the destination data, based on the number of info records available in the source data for the same product-id? Or is there another solution that I am missing?
Your help would be appreciated!
Regards,
Skander
Dear Poster,
Your thread has had no response since its creation over
two weeks ago; therefore, I recommend that you either:
- Rephrase the question.
- Provide additional Information to prompt a response.
- Close the thread if the answer is already known.
Thank you for your compliance in this regard.
Jason Boggans
SAP SRM SDN Moderator -
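The duplication idea Skander asks about can be sketched in Java as a hedged illustration. All names here are hypothetical, as is the record layout {productId, vendor, price}; this is not SRM-MDM Import Manager API, just the shape of the transformation: emit one catalog row per matching info record.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CatalogExpand {
    // Duplicate a vendor-less catalog item once per matching info record,
    // so every (vendor, price) pair for the product-id gets its own row.
    static List<String> expand(Map<String, String> catalog,
                               List<String[]> infoRecords) {
        List<String> rows = new ArrayList<>();
        for (String[] rec : infoRecords) {   // rec = {productId, vendor, price}
            if (catalog.containsKey(rec[0])) {
                rows.add(rec[0] + ";" + rec[1] + ";" + rec[2]);
            }
        }
        return rows;
    }
}
```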
How to get the most current record in reporting
Hi All,
I have a reporting requirement to get the most recent record.
Please see the example below: we have 3 records, with 2 different sales representatives assigned to the same document, but I need to get the most recently assigned sales representative record, that is, number 2 (10726056 and status open).
Now I am getting the 2 open records but need to get only one. We do not have any other difference in the records to keep track of the sales representative assignments, other than the request ID.
Can anyone tell me what the most-current-data variables under the request ID do? Will that solve my issue? Any other ideas are welcome.
Customer No Sales rep Doc number Status Request id
0000613086 10726056 9000783660 C REQU_1
0000613086 10726056 9000783660 O REQU_1
0000613086 10182679 9000783660 O REQU_2
Hi Sirisha,
It seems to be a problem of the "exception aggregation" kind, but let me say it can result in a very complex solution.
What about using a Virtual InfoProvider with services to solve the issue?
You should create an additional InfoCube that reads from the basic one (where you have data) and then "delete" invalid records considering the request number.
By the way, a timestamp seems to be the easier solution.
Hope it helps
GFV -
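The "keep only the most recent record per document" step GFV describes can be sketched in Java. This is purely illustrative: it assumes, for the sake of the example, that request IDs compare lexicographically by recency (as REQU_1 and REQU_2 do in the sample data above), and the row layout is hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LatestRecord {
    // Keep, per document number, only the row with the highest request ID.
    // Row layout (illustrative): {customer, salesRep, docNumber, status, requestId}
    static Map<String, String[]> latest(List<String[]> rows) {
        Map<String, String[]> best = new HashMap<>();
        for (String[] r : rows) {
            String[] cur = best.get(r[2]);
            if (cur == null || r[4].compareTo(cur[4]) > 0) {
                best.put(r[2], r);
            }
        }
        return best;
    }
}
```

Applied to the three sample rows above, only the REQU_2 assignment (sales rep 10182679) would survive for document 9000783660.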
How to generate individual records out of a date interval in SQL?
I have the following table in my db:
group varchar(3), start date, end date, value number
Obs.: The dates are in the format DD/MM/YYYY
This table has this single record:
'group1', 01/11/2007, 15/11/2007, 3
I need to query this table (using SQL) and generate the following result:
'group1', 01/11/2007, 3
'group1', 02/11/2007, 3
'group1', 03/11/2007, 3
...
'group1', 15/11/2007, 3
Does anybody know how I can produce this output?
Thank you
Message was edited by:
user596855
> This table has this single record:
'group1', 01/11/2007, 15/11/2007, 3
If, at some time in the future, this table contains more than one record, then the solutions above won't work and you might need a SQL statement like this:
SQL> create table mytable (groep,start_date,end_date,value)
2 as
3 select 'group1', date '2007-11-01', date '2007-11-15', 3 from dual union all
4 select 'group2', date '2007-11-10', date '2007-11-20', 88 from dual
5 /
Table created.
SQL> select groep
2 , day
3 , value
4 from mytable
5 model
6 partition by (groep,value)
7 dimension by (0 i)
8 measures (start_date day, end_date)
9 rules
10 ( day[for i from 1 to end_date[0] - day[0] increment 1] = day[0] + cv(i)
11 )
12 order by groep
13 , day
14 /
GROEP DAY VALUE
group1 01-11-2007 00:00:00 3
group1 02-11-2007 00:00:00 3
group1 03-11-2007 00:00:00 3
group1 04-11-2007 00:00:00 3
group1 05-11-2007 00:00:00 3
group1 06-11-2007 00:00:00 3
group1 07-11-2007 00:00:00 3
group1 08-11-2007 00:00:00 3
group1 09-11-2007 00:00:00 3
group1 10-11-2007 00:00:00 3
group1 11-11-2007 00:00:00 3
group1 12-11-2007 00:00:00 3
group1 13-11-2007 00:00:00 3
group1 14-11-2007 00:00:00 3
group1 15-11-2007 00:00:00 3
group2 10-11-2007 00:00:00 88
group2 11-11-2007 00:00:00 88
group2 12-11-2007 00:00:00 88
group2 13-11-2007 00:00:00 88
group2 14-11-2007 00:00:00 88
group2 15-11-2007 00:00:00 88
group2 16-11-2007 00:00:00 88
group2 17-11-2007 00:00:00 88
group2 18-11-2007 00:00:00 88
group2 19-11-2007 00:00:00 88
group2 20-11-2007 00:00:00 88
26 rows selected.
Regards,
Rob. -
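For comparison, the same interval-to-rows expansion can be sketched application-side in Java with java.time. This is a minimal illustration, not a replacement for Rob's MODEL-clause solution above; the class and method names are made up for the example.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

public class ExpandInterval {
    // Expand [start, end] inclusive into one row per day, repeating the
    // group and value on every row, as in the MODEL-clause output above.
    static List<String> expand(String group, LocalDate start,
                               LocalDate end, int value) {
        List<String> rows = new ArrayList<>();
        for (LocalDate d = start; !d.isAfter(end); d = d.plusDays(1)) {
            rows.add(group + " " + d + " " + value);
        }
        return rows;
    }
}
```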
JAXB duplicates database records in the output XML file
I am trying to export a database to an XML file using JAXB. But I get an XML file having my records with the @XmlElement name I gave it (SMS_Database) and also another one following it with <list/> as the root element name. I don't know where it's coming from. Here is the code:
import java.io.*;
import java.sql.*;
import java.util.ArrayList;
import javax.xml.bind.*;

public class Parse2Xml {
    static final String XMLBASE = "./SMS_Database.xml";
    static ArrayList<Intermed> dataList = new ArrayList<Intermed>();
    static Connection con = null;
    static PreparedStatement ps = null;
    static ResultSet rs = null;

    public static void main(String[] args) throws JAXBException, IOException {
        con = getConnection();
        try {
            ps = con.prepareStatement("SELECT * FROM SMS_Log");
            rs = ps.executeQuery();
            while (rs.next()) {
                dataList.add(getData(rs));
            }
            // Close the resources after the loop, not inside it
            rs.close();
            ps.close();
            con.close();
        } catch (SQLException ex) {
            ex.printStackTrace();
        }
        DataStore SMS_Database = new DataStore();
        SMS_Database.setList(dataList);
        JAXBContext context = JAXBContext.newInstance(DataStore.class);
        Marshaller m = context.createMarshaller();
        m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        Writer w = null;
        try {
            w = new FileWriter(XMLBASE);
            m.marshal(SMS_Database, w);
        } finally {
            try {
                w.close();
            } catch (Exception e) {
            }
        }
    }

    static Connection getConnection() {
        String sqlURL = "jdbc:mysql://localhost:3306/SMSDB";
        String username = "SUNNYBEN";
        String password = "drowssap";
        try {
            Class.forName("com.mysql.jdbc.Driver").newInstance();
        } catch (InstantiationException ex) {
            ex.printStackTrace();
        } catch (IllegalAccessException ex) {
            ex.printStackTrace();
        } catch (ClassNotFoundException ex) {
            ex.printStackTrace();
        }
        try {
            con = DriverManager.getConnection(sqlURL, username, password);
        } catch (SQLException ex) {
            ex.printStackTrace();
        }
        return con;
    }

    static Intermed getData(ResultSet rs) throws SQLException {
        Intermed mediator = new Intermed();
        mediator.setSms_id(rs.getString("sms_id"));
        mediator.setSender_id(rs.getString("sender_id"));
        mediator.setMessage(rs.getString("message"));
        mediator.setPhone_no(rs.getString("phone_no"));
        mediator.setDate_sent(rs.getString("date_sent"));
        mediator.setSchedule_date(rs.getString("schedule_date"));
        mediator.setUsername(rs.getString("username"));
        mediator.setResponse(rs.getString("response"));
        return mediator;
    }
}
import java.util.ArrayList;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(namespace = "SMS_Database")
class DataStore {
    @XmlElement(name = "SMS_Log")
    ArrayList<Intermed> dataList = new ArrayList<Intermed>();

    public ArrayList<Intermed> getList() {
        return dataList;
    }

    public void setList(ArrayList<Intermed> dataList) {
        this.dataList = dataList;
    }
}
import javax.xml.bind.annotation.*;

@XmlRootElement(name = "SMS_Log")
@XmlType(propOrder = {"sms_id", "sender_id", "message", "phone_no", "date_sent", "schedule_date", "username", "response"})
public class Intermed {
    private String sms_id;
    private String sender_id;
    private String message;
    private String phone_no;
    private String date_sent;
    private String schedule_date;
    private String username;
    private String response;

    public String getSms_id() { return sms_id; }
    public void setSms_id(String sms_id) { this.sms_id = sms_id; }
    public String getSender_id() { return sender_id; }
    public void setSender_id(String sender_id) { this.sender_id = sender_id; }
    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
    public String getPhone_no() { return phone_no; }
    public void setPhone_no(String phone_no) { this.phone_no = phone_no; }
    public String getDate_sent() { return date_sent; }
    public void setDate_sent(String date_sent) { this.date_sent = date_sent; }
    public String getSchedule_date() { return schedule_date; }
    public void setSchedule_date(String schedule_date) { this.schedule_date = schedule_date; }
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
    public String getResponse() { return response; }
    public void setResponse(String response) { this.response = response; }
}
Dear All,
My requirement is to generate an XML file which looks similar to:
<?xml version="1.0" encoding="UTF-8"?>
<?TestLine?>
<test_mt>
<field1>123</field1>
<field2>234</field2>
</test_mt>
How do I add the <?TestLine?> processing instruction?
Hi shabrasish and rajashekar,
I am new to Java and XSLT mapping. Can you guide me, or give me links similar to my requirement, on how to proceed with this kind of mapping?
Thanks,
uday
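On uday's <?TestLine?> question: one standard way, outside XSLT, is to build a DOM document, add a processing instruction before the root element, and serialize it with a Transformer. A minimal JDK-only sketch, with the element names taken from the sample above (in an XSLT mapping, the xsl:processing-instruction element would be the equivalent):

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class PiExample {
    static String build() {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            // The processing instruction is appended before the root element,
            // so it appears right after the XML declaration in the output.
            doc.appendChild(doc.createProcessingInstruction("TestLine", ""));
            Element root = doc.createElement("test_mt");
            doc.appendChild(root);
            Element f1 = doc.createElement("field1");
            f1.setTextContent("123");
            root.appendChild(f1);
            Element f2 = doc.createElement("field2");
            f2.setTextContent("234");
            root.appendChild(f2);
            // Identity transform serializes the DOM to a string
            StringWriter out = new StringWriter();
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```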
Maybe you are looking for
-
MacBook Air won't allow me to install any torrents. What do I do?
I have been trying to install uTorrent and my MacBook Air fails to install it. What do I need to do?
-
Which used iPhone (even discontinued) and how to check it for safety
I am happy with my 4s. In Italy you can have officially unlocked iPhones, and Vodafone gives me 2 SIMs with the same number (master/slave). Now I have the slave in a WM HTC which I use as a second telephone when going to places like the beach, or in unsafe places. O
-
I can't see some of the images on my webpage.
www.sinookpark.com is my webpage. I bought the domain and hosting through a Korean hosting company. At first I had a problem publishing my webpage, so I added an .htaccess file on my FTP. Now I can see my webpage, but some of my images are not seen. image
-
Reg: Userid and password
Dear all, recently I did certification on SAP SD. How can I get the password/user ID to log in to SAP Service Marketplace? Could you please help me! Regards, Raj
-
Undocumented error when trying to copy azure database (worked fine until recently)
Until recently, executing create database CB_Karakter_ACC as copy of jzv6wwe8gl.CB_Karakter_PRD on Azure instance bycz1k0pmu worked fine. Today, I'm suddenly getting: Msg 40197, Level 20, State 1, Line 65536 The service has encountered an error process