BW Data Testing
Hello
Hey, can anybody help me with testing methodology documents or links that cover data testing / unit testing in BW? My question is: how do you make sure that the data flowing into BW is correct? Just because a load is scheduled and nothing breaks does not effectively ensure data correctness. So what is a comprehensive way of handling data quality?
thanks
Mark
hi,
hope this link helps you:
http://help.sap.com/saphelp_nw04s/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
The link above provides details about the data flow in BI.
Monitoring link:
http://help.sap.com/saphelp_nw70/helpdata/en/c2/c5e742be760121e10000000a155106/frameset.htm
If helpful, please assign points.
regards
harikrishna.N
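Monitoring only shows that loads ran; a simple automated reconciliation between source and target is one way to catch wrong data. A minimal sketch (all table and field names here are made up for illustration, not from any BW API):

```python
# Hypothetical sketch: reconcile record counts and a key-figure sum between a
# source extract and the BW target after a load. Field names are illustrative.

def reconcile(source_rows, target_rows, key_figure):
    """Compare row counts and a summed key figure; return a list of findings."""
    findings = []
    if len(source_rows) != len(target_rows):
        findings.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}")
    src_sum = sum(r[key_figure] for r in source_rows)
    tgt_sum = sum(r[key_figure] for r in target_rows)
    if src_sum != tgt_sum:
        findings.append(f"sum({key_figure}) mismatch: source={src_sum} target={tgt_sum}")
    return findings

source = [{"amount": 100}, {"amount": 250}]
target = [{"amount": 100}, {"amount": 200}]
print(reconcile(source, target, "amount"))
# → ['sum(amount) mismatch: source=350 target=300']
```

An empty result means counts and sums match; anything else is worth investigating even if the load itself finished green.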
Similar Messages
-
Follow up question about interpreting smart data test on 160GB classic
I posted this question regarding the problem I am having with my 160GB ipod classic:
http://discussions.apple.com/thread.jspa?threadID=2153948&tstart=0
It appears no one is able to help me, so I have found some more information:
In short, iTunes will not write to my 160GB iPod once it reaches about 35GB of 160. Windows will write to it, but only at 500KB/sec rather than the usual 19+MB/sec. It just crawls once it hits that point. After following some advice online, I came up with the following results using the HD tests in diagnostic mode.
smart data test:
retracts: 138
rallocs: 563
pending sector: 55
poweron hours: 1016
starts/stops: 267
temp: current 26C
temp: min 19C
temp: max 53C
and for the HDD spec test:
sno: (not sure if i should post a serial number?)
FW Revision: NB100-05
LBAs: 0x12a19eb0
These are the only tests that show up under the I/O HardDrive section. Should there be more? It seems like there might be from what I read, but I am not sure.
Anyway, any help interpreting these results would be greatly appreciated.
adame wrote:
rallocs: 563
pending sector: 55
etc.
Your drive is probably toast. Compare your SmartData stats with mine:
Retracts: 844
Reallocs: 12
Pending Sectors: 0
PowerOn Hours: 2037
Starts/Stops: 830
Temp: Current 29c
Temp: Min 10c
Temp: Max 50C
I've only 12 remapped sectors and none pending. From the Wikipedia S.M.A.R.T. article:
Reallocated Sectors Count
Count of reallocated sectors. When the hard drive finds a read/write/verification error, it marks this sector as "reallocated" and transfers data to a special reserved area (spare area). This process is also known as remapping, and "reallocated" sectors are called remaps. This is why, on modern hard disks, "bad blocks" cannot be found while testing the surface – all bad blocks are hidden in reallocated sectors. However, as the number of reallocated sectors increases, the read/write speed tends to decrease. The raw value normally represents a count of the number of bad sectors that have been found and remapped. Thus, the higher the attribute value, the more sectors the drive has had to reallocate.
Pending Sector Count
Number of "unstable" sectors (waiting to be remapped, because of read errors). If an unstable sector is subsequently written or read successfully, this value is decreased and the sector is not remapped. Read errors on a sector will not remap the sector (since it might be readable later); instead, the drive firmware remembers that the sector needs to be remapped, and remaps it the next time it's written.
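As a rough illustration of how these two counters are read together, here is a simple classification; the thresholds are invented for illustration, not official vendor limits:

```python
# Rough illustration only: classify a drive from two SMART counters.
# Thresholds are illustrative, not vendor-defined limits.

def drive_health(reallocs, pending):
    # Pending sectors are unreadable right now; a large remap count means the
    # spare area is being consumed. Either is a bad sign.
    if pending > 0 or reallocs > 100:
        return "likely failing"
    if reallocs > 0:
        return "degraded but remapped"
    return "healthy"

print(drive_health(reallocs=563, pending=55))   # the poster's drive
print(drive_health(reallocs=12, pending=0))     # the replier's drive
```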
I'm sure you'll find the counts go up if you try to do a surface scanning disk check. Sorry it's not better news but at least you know where the problem lies. Assuming your iPod is out of warranty it's probably time to get your iPod refurbished with a new drive or to buy a new iPod.
tt2 -
In debug mode, I got the path '/usr/sap/apdo/data/test.txt'.
Can I physically find this txt file and open it now?
Where can I find the txt file?
Thanks.
AL11 has the application server paths. Go to AL11 and you will see the files; it is a centralized library for storing files.
It shows the file path, the directory, the file name, and even the data in the file.
To open your file from t-code AL11:
Double-click on the directory (physical path) and you will get all the files in the directory; double-click on the file name and you will see the content of the file.
Reward points for all helpful answers.
Regards,
Moqeeth. -
What are the following files they are requesting in Mail.Dat TEM?
(1) Reconciliation Report including Container information
(2) Qualification Report
(3) Version Summary Report
I am having problems figuring out what these files are.
Hi Nash,
It means that the Mail.dat you submit to PostalOne! should contain the information necessary to generate these documents.
For example, as mentioned in the TEM checklist, if you submit an 'original ready to pay' Mail.dat file (CSM container status is 'R' for original ready-to-pay), you should be able to see the documents you mentioned on the dashboard.
Thanks
Anita -
Validating data/Test for dso
Hello Guys
I am writing a test document (TD) for a DSO and need to specify test conditions and validation procedures.
So what can I do to prove my DSO is working fine?
1. Is it a valid test to run the extraction and check that the number of records in the DSO/PSA equals the source table, or RSA3, or the extractor?
2. And how do you verify the delta load? Would that be checking that the number of records loaded during the delta load equals the number of new records in the source, or what else can I do to verify the delta load?
I need this soon, so please help if you can. I will assign points right away.
Thanks
Hello Mark,
1. You can check the number of records of the given selection in RSA3 in the source system.
2. It's not easy to find the new records, as a delta captures both new and modified records, but if you still want to find them you can check the DSO change log table for the NEW record flag.
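A sketch of that change-log check (hypothetical field names; the real change-log layout depends on the DSO, so treat this as the shape of the comparison, not the actual tables):

```python
# Hypothetical sketch: verify a delta load by comparing the keys that are new
# in the source against change-log rows flagged as new records ('N').
# Field names here are illustrative, not the actual change-log columns.

def verify_delta(new_source_keys, change_log):
    loaded_new = {row["key"] for row in change_log if row["recordmode"] == "N"}
    missing = set(new_source_keys) - loaded_new   # new in source, not loaded
    extra = loaded_new - set(new_source_keys)     # loaded as new, not in source
    return missing, extra

change_log = [{"key": "A1", "recordmode": "N"},
              {"key": "B2", "recordmode": ""}]   # B2 was a modification
missing, extra = verify_delta(["A1", "C3"], change_log)
print(missing, extra)   # missing={'C3'}, extra=set()
```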
Hope it helps
Thanks
Chandran -
How to generate test data for all the tables in oracle
I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum records for master tables and child tables. The data should be consistent in the columns that are used for constraints, i.e. using the same column values.
planning to implement something like
execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
schemaname = owner,
minrecinmstrtbl= minimum records to insert into each parent table,
minrecsforchildtable = minimum records to enter into each child table of a each master table;
all_tables where owner = schemaname;
all_tab_columns and all_constraints where owner = schemaname;
using the dbms_random package.
Does anyone have a better idea for doing this? Is this functionality already there in the Oracle DB?
Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
There are two approaches you can take with this. I'll mention both and then ask which
one you think you would find most useful for your requirements.
One approach I would call the generic bottom-up approach which is the one I think you
are referring to.
This system is a generic test data generator. It isn't designed to generate data for any
particular existing table or application but is the general case solution.
Building on damorgan's advice, define the basic hierarchy: table collection, tables, data; and start at the data level.
1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
a. min length - the minimum length to generate
b. max length - the maximum length
c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
d. suffix - a suffix for the generated data; see prefix
e. whether to generate NULLs
3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
min/max scale.
4. store the attribute combinations in Oracle tables
5. build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated.
6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
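As a minimal sketch of steps 1-5 above: attribute-driven per-type generators. In the full design the attribute combinations would be stored in Oracle tables; here they are plain dicts, and all names are illustrative:

```python
import random
import string

# Minimal sketch of the bottom-up approach: per-type generator functions driven
# by attribute dicts (which, per step 4, would live in Oracle tables).

def gen_varchar2(min_len=1, max_len=10, prefix="", null_pct=0):
    if random.random() * 100 < null_pct:
        return None  # step 2e: optionally generate NULLs
    body_len = random.randint(min_len, max_len) - len(prefix)
    return prefix + "".join(random.choices(string.ascii_lowercase, k=max(body_len, 0)))

def gen_number(lo=0, hi=9999, scale=0):
    return round(random.uniform(lo, hi), scale)

# Column metadata for one table, mirroring step 4's stored attribute combinations:
columns = {"DNAME": (gen_varchar2, {"min_len": 5, "max_len": 14, "prefix": "dep"}),
           "DEPTNO": (gen_number, {"lo": 10, "hi": 99})}

# Step 7: a table-level process assembles one record from the type generators.
row = {col: fn(**attrs) for col, (fn, attrs) in columns.items()}
print(row)
```

The table-to-table step (step 8) would then reuse generated parent keys when populating child tables, instead of generating fresh values.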
The second approach I have used more often. I would call it the top-down approach, and I use
it when test data is needed for an existing system. The main use case here is to avoid
having to copy production data to QA, TEST or DEV environments.
QA people want to test with data that they are familiar with: names, companies, code values.
I've found they aren't often fond of random character strings for names of things.
The second approach I use for mature systems where there is already plenty of data to choose from.
It involves selecting subsets of data from each of the existing tables and saving that data in a
set of test tables. This data can then be used for regression testing and for automated unit testing of
existing functionality and functionality that is being developed.
QA can use data they are already familiar with and can test the application (GUI?) interface on that
data to see if they get the expected changes.
For each table to be tested (e.g. DEPT) I create two test system tables. A BEFORE table and an EXPECTED table.
1. DEPT_TEST_BEFORE
This table has all DEPT table columns plus a TESTCASE column.
It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
test for that test case is performed.
CREATE TABLE DEPT_TEST_BEFORE (
TESTCASE NUMBER,
DEPTNO NUMBER(2),
DNAME VARCHAR2(14 BYTE),
LOC VARCHAR2(13 BYTE)
);
2. DEPT_TEST_EXPECTED
This table also has all DEPT table columns plus a TESTCASE column.
It holds DEPT-image rows for each test case that show the row as it should look AFTER the
test for that test case is performed.
Each of these tables are a mirror image of the actual application table with one new column
added that contains a value representing the TESTCASE_NUMBER.
To create test case #3 identify or create the DEPT records you want to use for test case #3.
Insert these records into DEPT_TEST_BEFORE:
INSERT INTO DEPT_TEST_BEFORE
SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
look after test #3 is run. For example, if test #3 creates one new record, copy all the
records from the BEFORE data set and add a new row for the new record.
When you want to run test case #3 the process is basically (ignore for this illustration that
there is a foreign key between DEPT and EMP):
1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
DELETE FROM DEPT
WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
2. insert the test data set records for SCOTT.DEPT for test case #3.
INSERT INTO DEPT
SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
3. Perform the test.
4. compare the actual results with the expected results.
This is done by a function that compares the records in DEPT with the records
in DEPT_TEST_EXPECTED for test #3.
I usually store these results in yet another table or just report them out.
5. Report out the differences.
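Step 4's comparison function might look like this sketch, with rows modeled as dicts keyed on the primary key (names are illustrative; in the database this would be a PL/SQL function joining DEPT to DEPT_TEST_EXPECTED):

```python
# Sketch of step 4: diff actual table rows against the EXPECTED snapshot for
# one test case. Rows are modeled as dicts keyed by the primary key column.

def compare(actual, expected, key="DEPTNO"):
    a = {r[key]: r for r in actual}
    e = {r[key]: r for r in expected}
    diffs = []
    for k in sorted(set(a) | set(e)):
        if k not in a:
            diffs.append(("missing", e[k]))       # expected but absent
        elif k not in e:
            diffs.append(("unexpected", a[k]))    # present but not expected
        elif a[k] != e[k]:
            diffs.append(("changed", a[k], e[k]))
    return diffs

actual = [{"DEPTNO": 20, "DNAME": "RESEARCH", "LOC": "DALLAS"},
          {"DEPTNO": 50, "DNAME": "QA", "LOC": "BOSTON"}]
expected = [{"DEPTNO": 20, "DNAME": "RESEARCH", "LOC": "DALLAS"}]
print(compare(actual, expected))
```

An empty diff list means the test passed; anything else goes into the results table or report from step 5.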
This second approach uses data the users (QA) are already familiar with, is scaleable and
is easy to add new data that meets business requirements.
It is also easy to automatically generate the necessary tables and test setup/breakdown
using a table-driven metadata approach. Adding a new test table is as easy as calling
a stored procedure; the procedure can generate the DDL or create the actual tables needed
for the BEFORE and AFTER snapshots.
The main disadvantage is that existing data will almost never cover the corner cases.
But you can add data for these. By corner cases I mean data that defines the limits
for a data type: a VARCHAR2(30) name field should have at least one test record that
has a name that is 30 characters long.
Which of these approaches makes the most sense for you? -
HI
when i copy asp.net 3.5 application from IIS7 to IIS 8 in windows server 2012 and browse the application and try to save data in oracle , i get following error in Event viewer
Log Name: Application
Source: ASP.NET 4.0.30319.0
Date: 4/6/2015 2:53:21 PM
Event ID: 1309
Task Category: Web Event
Level: Warning
Keywords: Classic
User: N/A
Computer: TSharepint2013.test.com
Description:
Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 17/06/36 02:53:21 ?
Event time (UTC): 17/06/36 11:53:21 ?
Event ID: 34f214e1dcdf45f5ad8450739c954494
Event sequence: 24
Event occurrence: 1
Event detail code: 0
Application information:
Process information:
Process ID: 18724
Process name: w3wp.exe
Account name: TEST\splaw
Exception information:
Exception type: InvalidOperationException
Exception message: The 'Microsoft.Jet.OLEDB.4.0' provider is not registered on the local machine.
at System.Data.OleDb.OleDbServicesWrapper.GetDataSource(OleDbConnectionString constr, DataSourceWrapper& datasrcWrapper)
at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionInternal.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
at System.Data.OleDb.OleDbConnection.Open()
at printLawsuit.RetJudgement(Int32 x) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 2374
at printLawsuit.Page_Load(Object sender, EventArgs e) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 44
at System.Web.UI.Control.LoadRecursive()
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Request information:
Request URL: http://-----:50/executionreq/printLawsuit.aspx
Request path: /executionreq/printLawsuit.aspx
User host address: ---3.184
User:
Is authenticated: False
Authentication Type:
Thread account name: TEST\Spfarm
Thread information:
Thread ID: 16
Thread account name: TEST\Spfarm
Is impersonating: False
Stack trace: at System.Data.OleDb.OleDbServicesWrapper.GetDataSource(OleDbConnectionString constr, DataSourceWrapper& datasrcWrapper)
at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionInternal.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
at System.Data.OleDb.OleDbConnection.Open()
at printLawsuit.RetJudgement(Int32 x) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 2374
at printLawsuit.Page_Load(Object sender, EventArgs e) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 44
at System.Web.UI.Control.LoadRecursive()
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Custom event details:
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name="ASP.NET 4.0.30319.0" />
<EventID Qualifiers="32768">1309</EventID>
<Level>3</Level>
<Task>3</Task>
<Keywords>0x80000000000000</Keywords>
<TimeCreated SystemTime="2015-04-06T11:53:21.000000000Z" />
<EventRecordID>174735</EventRecordID>
<Channel>Application</Channel>
<Computer>TSharepint2013.test.com</Computer>
<Security />
</System>
<EventData>
<Data>3005</Data>
<Data>An unhandled exception has occurred.</Data>
<Data>17/06/36 02:53:21 ?</Data>
<Data>17/06/36 11:53:21 ?</Data>
<Data>34f214e1dcdf45f5ad8450739c954494</Data>
<Data>24</Data>
<Data>1</Data>
<Data>0</Data>
<Data>/LM/W3SVC/3/ROOT/ExecutionReq-1-130727947211114471</Data>
<Data>Full</Data>
<Data>/ExecutionReq</Data>
<Data>C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\</Data>
<Data>TSHAREPINT2013</Data>
<Data>
</Data>
<Data>18724</Data>
<Data>w3wp.exe</Data>
<Data>TEST\splaw</Data>
<Data>InvalidOperationException</Data>
<Data>The 'Microsoft.Jet.OLEDB.4.0' provider is not registered on the local machine.
at System.Data.OleDb.OleDbServicesWrapper.GetDataSource(OleDbConnectionString constr, DataSourceWrapper& datasrcWrapper)
at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionInternal.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
at System.Data.OleDb.OleDbConnection.Open()
at printLawsuit.RetJudgement(Int32 x) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 2374
at printLawsuit.Page_Load(Object sender, EventArgs e) in C:\inetpub\wwwroot\wss\VirtualDirectories\Law\ExecutionReq\printLawsuit.aspx.vb:line 44
at System.Web.UI.Control.LoadRecursive()
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
</Data>
<Data>http://----:50/executionreq/printLawsuit.aspx</Data>
<Data>/executionreq/printLawsuit.aspx</Data>
<Data>10.169.3.184</Data>
<Data>
</Data>
<Data>False</Data>
<Data>
</Data>
<Data>TEST\Spfarm</Data>
<Data>16</Data>
<Data>TEST\Spfarm</Data>
<Data>False</Data>
<Data> at System.Data.OleDb.OleDbServicesWrapper.GetDataSource(OleDbConnectionString constr, DataSourceWrapper& datasrcWrapper)
at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection,
</Data>
</EventData>
</Event>
adil
Hi,
As this question relates more to IIS, I suggest you post it to the IIS Forum; you will get more help and confirmed answers there.
http://forums.iis.net/
Best regards
TechNet Community Support
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
[email protected] -
SSL test in Oracle Application Server 10.1.2
Hi ,
we are using one Oracle Application Server Web Cache 10.1.2 as our web server.This redirects users to the Application Server.(EBS 12.0.4)
Now we need to implement SSL to secure user credentials, so I am planning to implement SSL on our Web Cache server. Will this be enough to secure user credentials? I don't have any idea. Please help.
Moreover, our Web Cache was not configured for SSL at installation time. Can anybody suggest a place where I could get a demo certificate to put on my web server to test the SSL configuration? EBS provides a demo for the application tier (inside $ORACLE_HOME/apache/certs/).
But we will have to implement it on the Web Cache tier, not the application tier.
We are using EBS 12.0.4 on HP-UX v11.23.
Hi Susmit,
Please check the notes below, which could be helpful for your issue:
SSL Primer on Using OpenSSL as a Certificate Authority with E-Business Suite (with a Windows Example) [ID 1175193.1]
Enabling SSL in Release 12 [ID 376700.1]
R12.0.6+ : Oracle Application Object Library SSL Test Transaction Data Test [ID 732282.1]
R12.0.[3-4] : Oracle Application Object Library SSL Test Transaction Data Test [ID 564066.1]
Regard
Helios -
Dear Experts,
If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject material. The way I did this was through tcode "S14". After that, I tried to load the master data again, but the process broke and the load only completed half the data.
This it is the error:
Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
Message no. RSDMD218
Diagnosis
During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
Procedure
- Check if the values of the master data record with the key specified in this message are updated correctly.
- Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
- Re-schedule the master data load process to avoid such situations in future.
- Read SAP note 668466 to get more information about master data update scheduling.
On the other hand, the SID table of the master data product is empty.
Thanks for your help!
LuisDear Daya,
Thanks for your help, but I already applied your suggestion. I sent an OSS message with the following details:
We are on BI 7.0 (system ID DXX)
While loading Master Data for infoobject XXXX00001 (main characteristic in our system u2013 like material) we are facing the following error:
Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
We are loading the Master data from data source ZD_BW_XXXXXXX (from APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001
The Master Data tables (S, P, X) are not updated properly.
The following repairing actions have been taken so far:
1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
2. Follow instructions from OSS 632931 (tcode RSRV)
3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both check and repair options).
After deleting all data, the previous tests were ok, but once we load new master data, the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error.
"Characteristic XXXX00001: error found during this test."
The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" shows:
Characteristic XXXX00001: tables /BIC/PXXXX00001, /BIC/XXXXX00001 are not consistent: 351.196 deviations.
It seems that our problem is described in OSS 1143433 (SP13), even if we already are in SP16.
Could somebody please help us, and let us know how to solve the problem?
Thanks for all,
Luis -
How can i convert the date from M to MM ?
Dear Guru ,
I need to upload my list to an SAP table, and in the list we are using the YYYY/M/D format (e.g. 2010/5/20, 2010/10/1).
Now I want to convert all dates to the YYYY/MM/DD format. Is it possible to do that?
Here is my code, but it doesn't work. It returns a "2009//3//5" format.
data: ld_date_int type datum.
data : test(10) type c.
test = '2009/3/5' .
ld_date_int = test .
WRITE : SY-SUBRC , LD_DATE_int .
Does SAP provide a standard function that can convert the date format?
Thanks .
Best Regards,
Carlos Zhang
Hi Carlos,
You can try it this way:
data: ld_date_int type string.
DATA : ld_string TYPE string.
data : test(10) type c,
ld_res1(4) TYPE c,
ld_res2(2) TYPE c,
ld_res3(2) TYPE c.
DATA : ll_res2 TYPE i,
ll_res3 TYPE i.
test = '2009/03/5' .
ld_date_int = test .
ld_string = strlen( ld_date_int ).
CASE ld_string.
WHEN 10.
WRITE : SY-SUBRC , LD_DATE_int.
WHEN OTHERS.
SPLIT ld_date_int at '/' INTO ld_res1 ld_res2 ld_res3 in CHARACTER MODE.
ll_res2 = strlen( ld_res2 ).
ll_res3 = strlen( ld_res3 ).
IF NOT ll_res2 eq 2 and not ll_res3 eq 2.
CONCATENATE: '0' ld_res2 INTO ld_res2.
CONCATENATE: '0' ld_res3 INTO ld_res3.
CONCATENATE ld_res1 '/' ld_res2 '/' ld_res3 INTO ld_date_int.
WRITE : SY-SUBRC , LD_DATE_int.
ENDIF.
IF ll_res2 eq 2 and not ll_res3 eq 2.
CONCATENATE '0' ld_res3 INTO ld_res3.
CONCATENATE ld_res1 '/' ld_res2 '/' ld_res3 INTO ld_date_int.
WRITE : SY-SUBRC , LD_DATE_int.
ENDIF.
IF NOT ll_res2 eq 2 and ll_res3 eq 2.
CONCATENATE: '0' ld_res2 INTO ld_res2.
CONCATENATE ld_res1 '/' ld_res2 '/' ld_res3 INTO ld_date_int.
WRITE : SY-SUBRC , LD_DATE_int.
ENDIF.
ENDCASE. -
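For comparison, the same zero-padding idea sketched outside ABAP, in Python (illustrative only; the ABAP answer above shows the equivalent string handling in the actual target language):

```python
# Split a YYYY/M/D string and zero-pad the month and day to two digits,
# mirroring the padding logic of the ABAP answer above.

def normalize_date(s):
    year, month, day = s.split("/")
    return f"{year}/{int(month):02d}/{int(day):02d}"

print(normalize_date("2010/5/20"))   # 2010/05/20
print(normalize_date("2010/10/1"))   # 2010/10/01
```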
Restoring Data after Hard Drive Replacement
My hard drive just died (luckily two weeks before my AppleCare ran out) and I am unsure of the best way to restore my data. I made a Carbon Copy Cloner clone of my hard drive before I took it in and thought I could just boot from that and copy over my data, but I can't seem to select the drive as a bootable option (it's a Western Digital 120GB FireWire and USB drive). Do I need to go through the initial OS X setup before I can do this? Also, I am debating using this as a chance to do some housekeeping. If I decide not to re-clone my hard drive, can I just copy over old apps like Word, or do I need to reinstall them? And if my user name and nickname are the same as before, will it cause any problems if I decide later to restore my hard drive from the clone copy?
Thank you for any help.
Jonathan:
1. First, you must format and zero out the new HDD. This spares out bad blocks and prepares your HDD for installation.
2. Then I suggest doing a fresh install of the system software.
3. You can then restore your Users Folder.
4. And, finally, some applications can be copied by drag and drop. I think Word is one of those.
1. Formatting, Partitioning Zeroing a Hard Disk Drive
Warning! This procedure will destroy all data on your Hard Disk Drive. Be sure you have an up-to-date, tested backup of at least your Users folder and any third party applications you do not want to re-install before attempting this procedure.
Boot from the install CD holding down the "C" key.
Select language
Go to the Utilities menu (Tiger) or the Installer menu (Panther & earlier) and launch Disk Utility.
Select your HDD (manufacturer ID) in left side bar.
Select Partition tab in main panel. (You are about to create a single partition volume.)
Select the number of partitions in the pull-down menu above the Volume diagram.
(Note: 1 partition is normally better for an internal HDD.)
Type in name in Name field (usually Macintosh HD)
Select Volume Format as Mac OS Extended (Journaled)
Click Partition button at bottom of panel.
Select Erase tab
Select the volume under Manufacturer ID (usually Macintosh HD).
Check to be sure your Volume Name and Volume Format are correct.
Select the Security Options button (Tiger) or the Options button (Panther & earlier).
Select Zero all data. (This process will map out bad blocks on your HDD. However, it could take several hours. If you want a quicker method, don't go to Security Options and just click the Erase button.)
Click OK.
Click Erase button
Quit Disk Utility.
2. Install System Software
Open installer and begin installation process.
Choose to Customize and deselect Foreign Language Translations and Additional Printer drivers.
Check box to install X11.
Proceed with installation.
After installation computer will restart for setup.
After setup, reboot computer.
Go to Applications > Utilities > Disk Utility.
Select your HDD (manufacturer ID) in left side bar.
Select First Aid in main panel.
Click Repair Disk Permissions.
Connect to Internet.
Go to Apple Menu > Software Update.
Install all updates.
Computer will restart after updates.
Go to Applications > Utilities > Disk Utility.
Select your HDD (manufacturer ID) in left side bar.
Select First Aid in main panel.
Click Repair Disk Permissions.
3. Copying Over an Account to a different Volume
Copying an Account directly by dragging or by cloning will not change the NetInfo information needed for the Account to be properly recognized.
To properly set up NetInfo
*Go to Apple Menu > System Preferences > Accounts
*Click on padlock and authenticate
*Click on plus sign under Login Options to add a new account.
*Set up an account and check box to allow user to administer.
*Log out of current account and log in to new Admin account
*Connect computer (Host) to computer with account to be transferred.
*If you are able to boot into FireWire target disk mode, you can do that and use SuperDuper to clone only the Users folder; otherwise, for FW or USB, go to the next step
* Drag Home folder from backup drive to Users Folder on new drive and rename it. (If you get an error message about permissions when doing the copy, click button to Authenticate and enter your password; alternatively, you can turn permissions off for the source disk in the Get Info window.)
*Switch names between copied Folder and Home Folder (with house icon) by putting 2 after Home Folder name, and naming newly copied Folder with account name (original home folder name).
*To see that copied folder is owned by the account that is logged in select folder and Get Info (Command + i), then go to Ownership & Permissions.
*Log out of account by going to the Apple Menu and selecting Log out Account Name (Command + Shift + Q)
*Log back in to account.
*If the account was successfully copied, you may uncheck Allow User to Administer option in Apple Menu/System Preferences/Accounts if you so choose.
adapted from a solution by Niel
Please post back with further questions, comments or update.
Good luck.
cornelius -
Using Field symbol to fetch data from another program
I have a requirement where I need to fetch the value of a field from a function pool, so I wrote a sample program to check the logic, but it doesn't seem to work; it gives a dump.
Basically, I want to know how to use the (PROGRAMNAME)FIELDNAME syntax with a field symbol.
* First program, ZTEST1
REPORT ztest1.
DATA test(25) VALUE 'ggg'.
SUBMIT ztest.

* Second program, ZTEST
REPORT ztest.
CONSTANTS: lv_memory(25) TYPE c VALUE '(ZTEST1)TEST'.
FIELD-SYMBOLS: <fs2> TYPE char25.
ASSIGN (lv_memory) TO <fs2>.
WRITE: <fs2>.
I am getting the same field-symbol assignment dump.
Hi Rahul,
You can use this concept between function modules that belong to the same function group (since both FMs share the same global memory area).
Also, if you call an FM or method from your program, the called FM or method can access the data of the calling program.
Hope this may help you.
Regards,
Smart Varghese -
ISE Fail OPEN configuration/testing
Greetings,
We will be performing a live test of ISE Fail Open on our production system tomorrow night. When the policy nodes are all unavailable we want the switches to allow open access to all devices on all interfaces.
I have done some testing of this on an individual test switch by routing packets destined for the ISE policy nodes to null0 to emulate a failure. It appears to be working well, but I was hoping for more input from the community before my live test tomorrow night.
First, I believe these to be the only commands needed to make this work correctly. Does anyone have any comment on this configuration? Am I missing anything? Do these timers seem OK? I'm wondering if the deadtime should be greater in case the nodes or the network connection are flapping?
Global Config:
radius-server dead-criteria time 5 tries 3
radius-server deadtime 5
dot1x critical eapol
Interface Config:
authentication event server dead action reinitialize vlan <normal data vlan>
authentication event server dead action authorize voice
authentication event server alive action reinitialize
Next, this is the behavior I am seeing after the policy nodes go down. Is this as it should be?
1. Absolutely nothing happens until an interface undergoes (re)authentication. All ports remain in current authentication/authorization state.
2. If an interface undergoes (re)authentication, the switch tries to reach one of the configured policy nodes. After 5 seconds there is a message that the first node is dead. In another 5 seconds there is a message that the second node is dead.
3. After another ~20 seconds, the interface that was attempting (re)authentication goes into Critical Authorization:
TEST#sh auth sess int f1
Interface: FastEthernet1
MAC Address: 1234.5678.90ab
IP Address: Unknown
User-Name: UserName
Status: Authz Success
Domain: DATA
Oper host mode: multi-host
Oper control dir: in
Authorized By: Critical Auth
Vlan Policy: 2
Session timeout: N/A
Idle timeout: N/A
Common Session ID: 0A0A010B0000013D093F17CC
Acct Session ID: 0x0000072B
Handle: 0x5A00013E
Runnable methods list:
Method State
dot1x Authc Failed
mab Not run
Critical Authorization is in effect for domain(s) DATA
TEST#
All other interfaces remain in current mode, nothing on them changes so long as they don't attempt to (re)authenticate.
4. If another interface attempts to (re)authenticate, it goes into critical state immediately w/o trying to contact the dead policy nodes.
5. The switch will try every so often (every 5 minutes?) to reach the policy nodes. If one of them is up, all interfaces that were in critical state immediately transition to normal authc/authz modes. Normal timers apply, dot1x endpoints come up almost immediately, mab clients lose connectivity until dot1x times out.
To emulate a global fail for the organization, I plan to stop the ISE services on both of my policy nodes.
Thanks for any comments/insights/input.We appreciate the detailed scenario description, the question itself was very informative.
I used
authentication event server dead action authorize vlan <access VLAN>
(i.e., with the critical VLAN set to the same VLAN as normal access)
instead of
authentication event server dead action reinitialize vlan -
Failed JCO destination name 'WD_RFC_METADATA_DEST' and MODEL DATA
Hi Friends
I created the metadata destination "WD_RFC_METADATA_DEST". When creating it, I used the message server of the CRM server as the technical system.
That is, I created the metadata destination for the CRM system.
Once the metadata destination was complete, I clicked "Test".
It displayed this error message:
Model Data test
com.sap.mw.jco.JCO$Exception: (102) RFC_ERROR_COMMUNICATION: Connect to message server host failed Connect_PM TYPE=B MSHOST=ecc15 GROUP=PUBLIC R3NAME=E15 MSSERV=sapmsE15 PCS=1 ERROR Group PUBLIC not found TIME Fri Feb 19 01:12:26 2010 RELEASE 700 COMPONENT LG VERSION 5 RC -6 MODULE lgxx.c LINE 4299 DETAIL LgIGroupX COUNTER 1
Meta data test
com.sap.mw.jco.JCO$Exception: (102) RFC_ERROR_COMMUNICATION: Connect to message server host failed Connect_PM TYPE=B MSHOST=ecc15 GROUP=PUBLIC R3NAME=E15 MSSERV=sapmsE15 PCS=1 ERROR Group PUBLIC not found TIME Fri Feb 19 01:13:19 2010 RELEASE 700 COMPONENT LG VERSION 5 RC -6 MODULE lgxx.c LINE 4299 DETAIL LgIGroupX COUNTER 1
What is the problem? I am not able to figure it out; can you tell me how to solve it?
Regards
Vijay Kalluri
Hi Vijay,
You need to check the following things to resolve this issue:
1. Check the hosts file entries for the CRM system (Start > Run > type 'drivers' > open the etc folder > hosts).
2. Check whether the SLD test is successful.
3. As per your error message, logon group PUBLIC was not found. Check in the CRM system (transaction SMLG) whether the PUBLIC group exists.
Then check the JCo destination parameters again and test them.
Hope this helps.
Thanks
Arun -
Facing Problem in parsing a string to date
Hi,
I was trying to parse a string into a date with the date format ("EEEE,MMM,d,h:mm"), but I always get the year as 1970.
here is my code
String strDate = "Saturday,Jan 19 7:31";
String dateFormat3 = "EEEE,MMM,d,h:mm";
try {
    DateFormat myDateFormat = new SimpleDateFormat(dateFormat3);
    result1 = myDateFormat.parse(strDate);
} catch (ParseException pe) {
    System.out.println("ERROR: could not parse date in string \"" + strDate + "\"");
}
Any solution for it?
This is my actual code:
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class TestingDate {

    public static void main(String[] args) {
        String dateFormat = "EEEE, MMM d h:mm a";
        Date test = new Date(2007, 0, 19, 19, 31); // deprecated Date constructor
        System.out.println("original date is " + test);
        String stringResult = DateToString(test, dateFormat);
        System.out.println("Date to string is " + stringResult);
        Date dateResult = stringToDate(stringResult, dateFormat);
        System.out.println("String to date is " + dateResult);
        String stringResult2 = DateToString(dateResult, dateFormat);
        System.out.println("Date to string is " + stringResult2);
    }

    public static String DateToString(Date test, String dateFormat) {
        String result = null;
        try {
            DateFormat myDateFormat = new SimpleDateFormat(dateFormat);
            result = myDateFormat.format(test);
        } catch (Exception e) {
            System.out.println("Exception is " + e);
        }
        return result;
    }

    public static Date stringToDate(String strDate, String dateFormat1) {
        Date result1 = null;
        try {
            DateFormat myDateFormat = new SimpleDateFormat(dateFormat1);
            result1 = myDateFormat.parse(strDate);
        } catch (Exception e) {
            System.out.println("exception is " + e);
        }
        return result1;
    }
}
I am facing a problem getting the actual date back. Please suggest a solution.
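For reference, the 1970 result comes from the pattern itself: a pattern with no year field leaves SimpleDateFormat to fill in the epoch default of 1970 when parsing. A minimal sketch of the effect and two possible fixes (the class name and sample values are my own, not from the original post; the pattern is adjusted to "EEEE,MMM d h:mm" so it actually matches the sample string):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.Locale;

public class YearFixDemo {
    public static void main(String[] args) throws ParseException {
        // Pattern without a year: parsed dates default to the epoch year 1970.
        SimpleDateFormat noYear = new SimpleDateFormat("EEEE,MMM d h:mm", Locale.US);
        Date d1 = noYear.parse("Saturday,Jan 19 7:31");
        Calendar cal = Calendar.getInstance();
        cal.setTime(d1);
        System.out.println(cal.get(Calendar.YEAR)); // 1970

        // Fix 1: include the year in both the string and the pattern.
        SimpleDateFormat withYear = new SimpleDateFormat("EEEE,MMM d yyyy h:mm", Locale.US);
        Date d2 = withYear.parse("Saturday,Jan 19 2008 7:31");
        cal.setTime(d2);
        System.out.println(cal.get(Calendar.YEAR)); // 2008

        // Fix 2: keep the original year-less string, then set the intended
        // year on a Calendar after parsing.
        cal.setTime(d1);
        cal.set(Calendar.YEAR, 2008);
        System.out.println(cal.get(Calendar.YEAR)); // 2008
    }
}
```

Either fix works; which one applies depends on whether the input strings can be changed to carry the year.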